
Intel Core i9-12900KS

But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.
1. I would rather not base anything I say on rumours. The 30-series was said to be super efficient before launch, and look how it turned out. There isn't a single card without an 8-pin power connector in the whole lineup, all the way down to the 3050, and power-to-performance ratios are just on par with similar Turing chips.

2. The post you commented on said that there should be a line drawn to how much power any graphics card is allowed to consume. The 4090's rumoured 450 W is way above that line. A car that has 1000 HP and does 10 miles per gallon is more efficient than one that has 150 HP and does 30 mpg, but do you really want to fuel a car that only does 10 mpg? If so, enjoy, but most people don't.

Edit: To stay on topic, Alder Lake is said to be super efficient too, but if that efficiency comes at 200+ Watts, I couldn't care less.
 
100% improvement after one generation is never going to happen (or: never again, if you were referring to some cases from the late 1990s-early 2000s)
RDNA 3 is going to be 100% faster in raster than RDNA 2, and even more in ray tracing (albeit at a higher TDP). Once we get out of the current stagnation, higher growth will come back. We are currently in a temporary trough caused by still using 2D silicon electronic circuits. Once we move to 3D photonic circuits made from different materials like graphene or molybdenum disulfide (around 2030), we will once again see greater gains in efficiency. Gamers should also voice their opinion that the current slow growth is unacceptable. Low-gigahertz CPUs and low-teraflops GPUs are extremely slow; we are only at the beginning of the evolution of computing and great things lie ahead, we've seen nothing yet. The i9-12900KS is just a bad joke: it's no better than a normal 12900K and no one should buy one.
 
I'm more radical than that - my line (currently) is at 250 W. My Seasonic Prime Ultra Platinum 550 W is the best PSU I've ever had, and it's still well within warranty thanks to Seasonic's amazing warranty policy (10 years for Focus, 12 for Prime). It wasn't too cheap, either, so I'd rather not buy another one just because Intel, Nvidia and AMD decided to go balls to the wall with performance. 200 W for the CPU, 250 W for the GPU and 100 W for everything else, including a little safety headroom, should be enough.
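To put that budget into numbers, here's a trivial sketch using my own limits above (nothing more than my personal arithmetic, not a recommendation):

```python
# My personal power budget checked against my 550 W PSU (my own limits, not a standard).
cpu_w = 200     # CPU power limit
gpu_w = 250     # GPU power limit (my current line)
rest_w = 100    # board, RAM, drives, fans, plus a little safety headroom

total_w = cpu_w + gpu_w + rest_w
psu_w = 550

print(f"Budgeted system draw: {total_w} W of a {psu_w} W PSU")  # 550 W of 550 W
```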

I'm quite radical with PC size, too. I love mini-ITX systems, even though I have a micro-ATX one at the moment, which is the biggest size I'm happy with. Full towers with lots of unused expansion slots are a thing of the past, imo. I also don't like how they look. I know, one needs the space for airflow, but all the void inside makes the case look empty and bigger than it really should be - kind of the same way people buy SUVs for going to the supermarket once a week. My PL-unlocked i7-11700 and RTX 2070 are kind of the maximum of what I can comfortably cool in this size.
Well, that is a matter of perspective. 250 W vs 400 W, sure, it's a difference, but going for 600 W or higher like some people suggest (which, if we keep this course of action, will happen) is ridiculous, and saying "you don't have to buy it" is just plain stupid. Also, saying "you can downclock it and save power" is even more ridiculous in my opinion, since you pay for the performance of the card. So you will pay a shitload to get an obviously slower 250 W card. Not good.
I have a 6900XT, and you may argue about the power it uses, but the fact is, it doesn't use a lot. I really don't ever go over 260 W. It's not OC'd, but it doesn't have to be either.
But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.
They are raising the power usage. You said it yourself: the 4090 will use 600 watts. How is that not raising power consumption for the cards? It will be faster, no doubt, but the power for a card is almost three times higher than it used to be.
Yes, it is almost twice the 3090, and each of those cards had its power draw go up. Each generation, the power draw goes up. Are you blind or something?
1080 Ti, card-only power draw: 231 W
2080 Ti, card-only power draw: 273 W
3080 Ti, card-only power draw: 356 W
All gaming power draw only.
Don't even try to tell me "it's just the way it is" or "it performs better", because that is simply bullshit. Power goes up regardless of how fast the card is, and now you will get a 4090 drawing 600 W and a 4080 Ti drawing 550 W. That is terrible no matter the performance. The power goes up for all the cards, and that is clearly illustrated by TPU's reviews: each generation, power draw for the same card segment goes up.
A 600 W 4090 with twice the performance of a 350 W 3090 is still a shitload of power consumption for both cards. NV has been upping the power needed for its cards exponentially every generation. Obviously the 3090 Ti uses even more, so then they will compare the 4090 to a 3090 Ti with a 450 W power draw and it will not seem so bad. But if you look across generations of cards, it sucks badly.
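Just to put rough numbers on "each gen power draw goes up", here's a quick sketch using the TPU gaming figures listed above plus the rumoured 4080 Ti number (the Ada value is a rumour, not a measurement):

```python
# Gaming power draw per flagship-tier card, card only (TPU figures above;
# the Ada entry is a rumoured number, not measured).
power_w = {
    "1080 Ti": 231,
    "2080 Ti": 273,
    "3080 Ti": 356,
    "4080 Ti (rumoured)": 550,
}

cards = list(power_w.items())
for (prev_name, prev_w), (name, w) in zip(cards, cards[1:]):
    increase = (w / prev_w - 1) * 100
    print(f"{prev_name} -> {name}: +{increase:.0f}% power draw")
# 1080 Ti -> 2080 Ti: +18% power draw
# 2080 Ti -> 3080 Ti: +30% power draw
# 3080 Ti -> 4080 Ti (rumoured): +54% power draw
```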

Edit: To stay on topic, Alder Lake is said to be super efficient too, but if that efficiency comes at 200+ Watts, I couldn't care less.
Some people are just too blind to see this. You can't keep boosting the power over and over just because the efficiency is there. That is a dead end, and it means the companies are slacking in delivering a good, well-rounded product.
 
Is it ironic that the best big.LITTLE approach would be a quad core intel with an 8 core ryzen?
 
Is it ironic that the best big.LITTLE approach would be a quad core intel with an 8 core ryzen?
Disaggregation at its finest.
 
Perfect CPU to pair with a 3090 Ti...

...for the next 6 months, when you will have to swap both components because you always need to have the best of the best.
 
On the gaming testing front, is there any chance that some non-FPS metrics could be incorporated next time the suite gets an overhaul? Stuff like AI turn time for Civ 6 (or 7 if released), late-game tick rates (or an abstraction) for the Paradox grand strategy games like CK3, Stellaris and EU4, late-game city builder tick rates, and so on.

While the 720p results give an indication, finding a way to actually benchmark those games, where the CPU is far more important to the gameplay experience than the GPU, would be a great addition to the CPU test suite.

For most of the games above, my 2200G could do 4K at playable frame rates. The issues that part had (and still has) are late-game hitching during calculations, turn time, and the fact that the late-game 'fastest' game speed often becomes slower than the early-game 'normal', making it drag. FPS is rarely an issue worth complaining about.
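For what it's worth, the kind of measurement I mean is conceptually simple. This is only a hypothetical sketch (the function name and the way a turn gets advanced are made up, since real games rarely expose such a hook):

```python
import time

def benchmark_turns(run_one_turn, num_turns=10):
    """Time a late-game save by measuring wall-clock seconds per AI turn.

    `run_one_turn` is a stand-in for whatever scripted input or hook
    advances the game by one turn; it does not exist in any real game.
    """
    times = []
    for _ in range(num_turns):
        start = time.perf_counter()
        run_one_turn()
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)

# Hypothetical usage with a dummy "turn" that just sleeps:
avg = benchmark_turns(lambda: time.sleep(0.05), num_turns=5)
print(f"Average turn time: {avg:.2f} s")
```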

As for this CPU. Does not seem worth it over the 12900K. Barely any extra performance for a pretty hefty price increase.

Agreed on turn times. Lag (and frequency) of shader and texture streaming also needs to be added to the suite; granted, it's a headache to implement, but times are moving. I think TPU should be a leader on that front.
 
But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.
CPU power consumption will also go up as the GPU workload increases, and heat output will rise as well. It is a big jump in efficiency in terms of performance per watt, but at the same time it is moving the goalposts on power draw. That's justified in some instances, but on the whole, the goalposts shouldn't be moving, or at least not in that direction.
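The performance-per-watt arithmetic is easy to put into numbers. A rough sketch using the rumoured figures floating around this thread (rumours, not measurements):

```python
def perf_per_watt_gain(perf_ratio, power_ratio):
    # Relative performance per watt compared to the previous card.
    return perf_ratio / power_ratio

# Scenario from the quote above: +100% performance for +20% power.
print(f"{perf_per_watt_gain(2.0, 1.2):.2f}x perf/W")        # ~1.67x

# Scenario from earlier in the thread: a 600 W 4090 that is 2x a 350 W 3090.
print(f"{perf_per_watt_gain(2.0, 600 / 350):.2f}x perf/W")  # ~1.17x
```

Either way efficiency improves by these rumoured numbers, but the absolute draw still climbs, which is exactly the goalpost problem.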
 
PC power consumption really needs to stop growing at some point, or we will have multi-kilowatt PCs in the future, which is absurd. 65 watts for a CPU and 150 watts for a GPU seem good to me. There's not much difference in game framerate between a 65-watt 5600 or 5600X and a 300-watt 12900K or 12900KS anyway.
 
PC power consumption really needs to stop growing at some point, or we will have multi-kilowatt PCs in the future, which is absurd. 65 watts for a CPU and 150 watts for a GPU seem good to me. There's not much difference in game framerate between a 65-watt 5600 or 5600X and a 300-watt 12900K or 12900KS anyway.

Games do not fully utilize that many cores, because they do not need to, so you can use a lower power CPU without issues.

But with GPUs, they will utilize all you can throw at them, without limits. The difference between a 100 W and a 300 W GPU is gigantic.

Those 600 W graphics cards will be pushed to the extreme, compromising efficiency. Those cards are for people who used to have multi-GPU setups. 3-way SLI was a thing, and Quad CrossFire I think.

Multi-GPU is dead now, so enthusiasts might want something extremely powerful. You will still get great products in the 150-300 W range, and if you undervolt those, you will get exceptional efficiency, just like you can get with a 3080 for example (mine consumes 200-250 W while losing about 10% performance vs. stock 340 W).
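To put my 3080 example into performance-per-watt terms (my own rough numbers from above, so treat this as a sketch rather than a measurement):

```python
stock_power_w = 340   # stock 3080 board power
uv_power_w = 225      # my undervolted draw, roughly the middle of 200-250 W
uv_perf = 0.90        # about 10% performance lost vs. stock

stock_eff = 1.0 / stock_power_w   # relative performance per watt at stock
uv_eff = uv_perf / uv_power_w     # relative performance per watt undervolted

print(f"Undervolted perf/W: {uv_eff / stock_eff:.2f}x stock")  # ~1.36x
```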
 
PC power consumption really needs to stop growing at some point, or we will have multi-kilowatt PCs in the future, which is absurd. 65 watts for a CPU and 150 watts for a GPU seem good to me. There's not much difference in game framerate between a 65-watt 5600 or 5600X and a 300-watt 12900K or 12900KS anyway.
How many times do we need to repeat that the 12900 doesn't consume 300 watts during gaming? Actually, it's a very efficient gaming CPU, way more efficient than most Zen 3s.
 
Hi,
As many times as it takes

Some people can't even understand the difference between Intel system BIOS defaults versus AMD system BIOS defaults when comparing power usage.
They just keep insisting the Intel chip is more efficient after they alter the BIOS settings. Wow, you think? :laugh:
 
Hi,
As many times as it takes

Some people can't even understand the difference between Intel system BIOS defaults versus AMD system BIOS defaults when comparing power usage.
They just keep insisting the Intel chip is more efficient after they alter the BIOS settings. Wow, you think? :laugh:
No BIOS alteration is needed to make Alder Lake more efficient in gaming.

I can't fathom that there are actually people who think it consumes 300 watts while gaming. Lol
 
If you're gaming or idling around the web, does power consumption really matter all that much? Not everyone is running Blender or whatever 24/7.
 
If you're gaming or idling around the web, does power consumption really matter all that much? Not everyone is running Blender or whatever 24/7.
In light loads, again, Alder Lake is way more efficient. I see 7 watts at idle and around 15 to 20 when browsing.
 
Stock 12900KS 63 W vs 5950X 54 W according to TPU, so in actuality a stock 12900KS is 9 watts higher at idle than a 5950X. The 5950X is also a chip geared towards multi-threaded performance, and in that area it pummels the 12900KS on efficiency. Single-threaded performance isn't as critical as it used to be, and multi-threaded performance has been getting better and better on both the software and hardware side. In fact, if you were to use a single 12900KS or 5950X to replace two side-by-side desktop PCs, it's pretty readily obvious which would be the better pick overall.
 
Stock 12900KS 63 W vs 5950X 54 W according to TPU, so in actuality a stock 12900KS is 9 watts higher at idle than a 5950X. The 5950X is also a chip geared towards multi-threaded performance, and in that area it pummels the 12900KS on efficiency. Single-threaded performance isn't as critical as it used to be, and multi-threaded performance has been getting better and better on both the software and hardware side. In fact, if you were to use a single 12900KS or 5950X to replace two side-by-side desktop PCs, it's pretty readily obvious which would be the better pick overall.
To me, the real difference between Intel and AMD isn't just performance or efficiency, but heat transfer. If you have a standard ATX system and go balls to the wall with cooling, it doesn't really matter which side you pick, as both are awesome for different reasons. On the other hand, if you go small form factor with a compact micro-ATX or even mini-ITX system, and your cooling options are limited, Intel CPUs are just way easier to cool, even if you give them the same power limit as you would your AMD one. Reviews don't tend to look at things this way, but I know from experience, as I like buying lots of hardware just for the fun of it.
 
To me, the real difference between Intel and AMD isn't just performance or efficiency, but heat transfer. If you have a standard ATX system and go balls to the wall with cooling, it doesn't really matter which side you pick, as both are awesome for different reasons. On the other hand, if you go small form factor with a compact micro-ATX or even mini-ITX system, and your cooling options are limited, Intel CPUs are just way easier to cool, even if you give them the same power limit as you would your AMD one. Reviews don't tend to look at things this way, but I know from experience, as I like buying lots of hardware just for the fun of it.
Yeah, Intels have been easier to cool and if you check TPU's cooler reviews it tends to show. If you go to their extreme tests you can see that even cheap small tower coolers can keep a 10900k from throttling while the 3900x requires the more expensive stuff.
 
Yeah, Intels have been easier to cool and if you check TPU's cooler reviews it tends to show. If you go to their extreme tests you can see that even cheap small tower coolers can keep a 10900k from throttling while the 3900x requires the more expensive stuff.
But the 5950x is much easier to cool than the 12900k.
 
But the 5950x is much easier to cool than the 12900k.
That's only because of the power budget. Intel decided to limit power to 241 W with Alder Lake, which is crazy. If you slap the same cooler onto both and limit power to, let's say, 125 W, the Intel chip will run cooler.

Another interesting thing is that AMD chips with fewer cores per CCX run hotter, as the same power is consumed by fewer cores (that is, a smaller, more concentrated die space). So a 5600X is hotter than a 5800X and a 5900X is hotter than a 5950X when set to the same power limit.
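A crude way to see the "same power over fewer cores" effect (this ignores the I/O die and uncore power, so it's only a rough sketch of heat concentration, not real per-core figures):

```python
# Same package power limit spread over different core counts.
# I/O die and uncore power are ignored, so treat this only as an illustration.
power_limit_w = 125
core_counts = {"5600X": 6, "5800X": 8, "5900X": 12, "5950X": 16}

for cpu, cores in core_counts.items():
    print(f"{cpu}: ~{power_limit_w / cores:.1f} W per core")
# 5600X: ~20.8 W per core
# 5800X: ~15.6 W per core
# 5900X: ~10.4 W per core
# 5950X: ~7.8 W per core
```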
 
So what? If you limit the AMD CPU, that one will also run cooler.
Hi,
Yeah sort of a waste of time repeating this :laugh:

All I have is Intel systems, and you'd always hit a thermal wall and have to limit the voltage.

But this efficiency nonsense has just gone to the outer limits with these guys.
 
I have a 12700K and run it stock (it's plenty quick enough). All I have done is set PL1 and PL2 to 195 W.
 