
AMD Shows First Ryzen 7 7800X3D Game Benchmarks, Up To 24% Faster Than Core i9-13900K

1) Who runs their PC at 24/7 load? Unless you donate time to a project like Folding@home?

This is kind of a strawman argument. You don't need to run your PC under load 24/7 to see the savings. Even with half that time it's still over $100 in savings, excluding any environmental cooling costs.

The power difference between the 7950X3D and the 13900K in mixed workloads is nearly 100 W as well, so even in those scenarios the 7950X3D is going to consume vastly less power.
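
To put rough numbers on that, here's a back-of-the-envelope sketch; the 100 W delta is from above, while the 12 h/day duty cycle and the $0.25/kWh rate are assumptions for illustration:

```python
# Back-of-the-envelope energy cost estimate (illustrative assumptions).
power_delta_w = 100        # approximate load power difference, watts
hours_per_day = 12         # "half that time": 12 h/day under load
rate_usd_per_kwh = 0.25    # assumed electricity rate

kwh_per_year = power_delta_w / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * rate_usd_per_kwh:.0f}/year")
# 438 kWh/year -> $110/year, before any air-conditioning costs
```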

That said, I'd challenge the notion that most people buying a 7950X3D aren't putting all those cores to use. If they aren't, they're wasting their money. It still takes me a day and a half to run a high-quality AV1 encode on a 7950X.

There are certainly other professions that will be putting these CPUs to work around the clock, whether that's encoding, rendering, etc.

2) You can set eco mode to 65W on regular chips and they will have the same energy savings without any additional tweaking needed.

65 W will tank performance on a 13900K, and the 7950X loses performance at 65 W as well. Mind you, nothing is stopping you from enabling Eco Mode or tweaking these X3D chips either; Eco Mode and PBO power settings are available on all Ryzen 7000 series processors.
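
For reference, these are the Eco Mode limits commonly reported for Ryzen 7000 in reviews; treat the exact figures as assumptions and verify them in your own BIOS or Ryzen Master:

```python
# Commonly reported Ryzen 7000 power presets (unofficial figures from
# reviews; confirm in BIOS/Ryzen Master before relying on them).
RYZEN_7000_PRESETS = {
    "stock (170 W TDP)": {"PPT_W": 230, "TDC_A": 160, "EDC_A": 225},
    "105 W Eco Mode":    {"PPT_W": 142, "TDC_A": 110, "EDC_A": 170},
    "65 W Eco Mode":     {"PPT_W": 88,  "TDC_A": 75,  "EDC_A": 150},
}
```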

The only reason to get these 3D chips is for gaming, or a workload that really benefits from the extra cache, if there is any outside of games.

Bigger energy savings would be had on laptops, because desktop base power draw is relatively large due to platform inefficiency, but really, the consumption here is not a big part of most people's energy use.

Would the energy savings actually be larger if you switched to a laptop though?

AMD's comparable mobile processor, the 7945HX, has a TDP between 55 and 75 W and consumes over 100 W during a heavy multi-core workload. When limited to 100 W, it scores 33,487 points in CB R23 MT. Meanwhile, the 7950X3D scores 35,693 when set to prefer cache and consumes 141 W. Given that even at stock it's within striking distance of the brand-new 7945HX, it's very possible that when limited to 100 W the 7950X3D can match or even beat the power efficiency of the mobile part. It's also worth noting that the 7945HX is only a few days old and is 36% faster than the i9-13980HX when both are limited to 100 W, so it's far and away the most efficient part for this comparison.

Source for 7945HX info: https://www.notebookcheck.net/AMD-s...te-much-lower-power-consumption.698349.0.html
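
Running the points-per-watt math on the figures quoted above makes the comparison concrete (the 100 W-limited X3D scenario remains hypothetical):

```python
# Cinebench R23 MT efficiency from the figures quoted above.
score_7945hx, watts_7945hx = 33_487, 100    # 7945HX limited to 100 W
score_7950x3d, watts_7950x3d = 35_693, 141  # 7950X3D stock, prefer-cache

print(f"7945HX @ 100 W: {score_7945hx / watts_7945hx:.0f} pts/W")    # ~335
print(f"7950X3D stock : {score_7950x3d / watts_7950x3d:.0f} pts/W")  # ~253
# Efficiency rises sharply as the power limit drops, so a 100 W-limited
# 7950X3D closing that gap is plausible, though untested here.
```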

I think you are underestimating the efficiency of the 3D cache. Even without the power binning a top-end laptop SKU goes through, or mobile power optimizations, the 7950X3D could feasibly match a laptop chip in power efficiency.

I can think of many regions in the world where either electricity prices or climate heavily incentivize a 7950X3D or 7800X3D purchase, aside from its class-leading performance.
 
Are the 8-core cached CCD yields in trouble, based on the limited stock of the 7950X3D? It hasn't been restocked 11 days post-launch on AMD's website, Micro Center, Best Buy, Amazon (besides scalped pricing), Newegg (besides scalped pricing), B&H, or GameStop. In case anyone is interested in the 7900X3D, there are still dozens of CPUs available right now across the 25 Micro Center stores.
 
Looking forward to W1zzard's review on this one. Should be interesting to see how it goes up against the AM4 champ (5800X3D).
 
24h 100% load every day?
Going from a 5800X to a 5800X3D took my gaming wattages down by about 50 W, which is absolutely noticeable in summer here.

You don't need a 24/7 load to notice a significant difference in power consumption and heat output.
 
Are the 8-core cached CCD yields in trouble, based on the limited stock of the 7950X3D? It hasn't been restocked 11 days post-launch on AMD's website, Micro Center, Best Buy, Amazon (besides scalped pricing), Newegg (besides scalped pricing), B&H, or GameStop. In case anyone is interested in the 7900X3D, there are still dozens of CPUs available right now across the 25 Micro Center stores.

I don't think so; AMD made the cache die even smaller this time, so they should be yielding more per wafer. It's more likely that AMD is allocating more 3D V-Cache chiplets to higher-margin markets, as they are pushing for a large gain in server market share.
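
A crude dies-per-wafer estimate shows why even a small shrink of the cache die matters; the die areas below are illustrative placeholders, not confirmed V-Cache die sizes:

```python
import math

# Crude dies-per-wafer approximation (ignores defect density, scribe
# lines, and partial-die edge effects).
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Placeholder areas for a "larger" vs. "smaller" cache die:
print(dies_per_wafer(41))  # ~1619 dies
print(dies_per_wafer(36))  # ~1852 dies, roughly 14% more per wafer
```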

AMD has been prioritizing server CPUs over desktop for the last few years. A good example of this is the 7950X compared to the 5950X. The only thing that made the 5950X very efficient was that it used higher-binned chiplets. The 7950X doesn't see that same jump in efficiency, though, as AMD decided it would rather sell those higher-quality chiplets to enterprise customers at a higher markup.
 
Up to 24% doesn't sound out of whack to me, considering we know this is based on select Ryzen-favoring titles. Looking at TPU's CCD1-disabled 7950X3D 1080p gaming benchmarks, a couple of titles are already showing a 15-23% increase in performance over the 13900K. Actually, it's better to break them up, as the Cyberpunk result may need revisiting.

Battlefield +15%
Borderlands +18%
Cyberpunk +23% (really?)

(AMD loyalists, don't get too excited. Intel loyalists, don't get your panties in a twist; Intel also enjoys similar performance advantages in select titles. Just thought I'd get that out there before the loyalist blah blah ensues.)

I'm more interested in the 5800X3D/13700K and 7800X3D showdown... although I don't expect much more than the already-reported (7950X3D) typical generational increase in performance.

In terms of value, at the moment it seems like an average 15% performance uplift over the 5800X3D... not enticing enough to blow $450 plus the AM5 adoption tax. I actually warmed up to AMD big time and was considering buying into AM5 coming from an Intel 9700K... but nah, it's kinda boring now. They've done a fine job delaying the living daylights out of an X3D gaming chip, not to mention the asking price (almost half a grand, for heaven's sake).
 
Up to 24% doesn't sound out of whack to me, considering we know this is based on select Ryzen-favoring titles.
Borderlands 3/Wonderlands, for example, show some insane gains with X3D chips; look at how the jump from the 5800X to the 5800X3D went.

[TPU chart: simulated 7800X3D gains in the Borderlands engine]


This is from TPU's simulated 7800X3D testing, but it shows a possible 50% gain in this game engine, so 25% in multiple titles does seem entirely plausible.
 
I see from one of the AMD slides attached to this news article that there is a claim that the 7950X3D has a 50% advantage over the 13900K in file compression; see the graphic below.

[AMD slide: 7950X3D vs. 13900K, claimed 50% file compression advantage]


But the TechPowerUp review of February 27 this year shows much less than that. In WinRAR it shows a narrow win for the 13900K, as this extract shows:

[TPU chart: WinRAR results, 13900K narrowly ahead]


And in 7-Zip, although the 7950X3D is on top, the advantage is only around 5%, as this screenshot from the review shows:

[TPU chart: 7-Zip results, 7950X3D ahead by ~5%]


I appreciate that the wording "UP TO" is being used, but where did the 50% come from?
 
RAR vs. ZIP, the type of compressible content...
 
If that's the best Zen 4 3D can do, it is "nice", but nothing special to talk about. The great power efficiency is offset by the high platform cost. At the high end for applications you have real competition; everything else is in Intel's hands. The total cost is still too high with AMD for most, and the lower you go in tier, the more pronounced it is.
We can just hope that Zen 4 is ironing out all the problems with the new platform so Zen 5 will see a smooth launch.
 
If that's the best Zen 4 3D can do, it is "nice", but nothing special to talk about. The great power efficiency is offset by the high platform cost. At the high end for applications you have real competition; everything else is in Intel's hands. The total cost is still too high with AMD for most, and the lower you go in tier, the more pronounced it is.
We can just hope that Zen 4 is ironing out all the problems with the new platform so Zen 5 will see a smooth launch.

What high platform cost? A motherboard is $140. The best DDR5-5600 CL28 RAM is $130, for 32 GB of it!
 
As someone who is currently using a 5800X3D, there is little reason to upgrade; the big cost of doing so isn't worth it for "20-25%", and I doubt the performance increase would be noticeable in World of Warcraft. Going from a 5800X to the X3D was such a massive improvement that I can't imagine the 7800X3D would be any better without also getting a much stronger GPU than the RTX 3080 I have.

You're at 1440p 144 Hz? You should max it out with a 5800X3D and a 3080.

You'd probably be better off upgrading the 3080 than going AM5, indeed.

These benchmarks are at 1080p, which is totally useless for pretty much anyone who's dropping the money on AM5 and a top GPU.
 
I suggest that for every "up to" figure you should be obligated to add a "down to" figure.
 
It would be great if Intel dropped these E-core gimmicks. Just plain P-cores. I have found zero use for those notepad-tier CPU cores; they're disabled in my BIOS already.
Try Warzone 2 with E-cores on and off (or Cyberpunk, or Spider-Man) and you'll change your mind. 1% lows fall flat on their face with E-cores off.
 
Try Warzone 2 with E-cores on and off (or Cyberpunk, or Spider-Man) and you'll change your mind. 1% lows fall flat on their face with E-cores off.
That's not the point. You could have had more P-cores instead; then you'd be better off than with E-cores on.
 
That's not the point. You could have had more P-cores instead; then you'd be better off than with E-cores on.
That's a completely different issue, and it's debatable. What's not debatable is that disabling E-cores on current CPUs loses you performance in plenty of games. There is a game here and there (I haven't hit any myself) that may work better with E-cores off, but if you find such a game, you can disable the E-cores at the press of a button from within Windows, on the fly.
 
Well, being a 4K gamer, I only care about this:
[TPU chart: minimum FPS at 3840×2160]


Efficiency matters very little when I pay $9 a month for my entire PC; 50 W less means I pay $1 less per month. ;)
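
That math checks out under plausible assumptions (the hours and rate below are guesses for illustration):

```python
# Sanity check on the ~$1/month figure (hours and rate are assumptions).
savings_w, hours_per_day, rate_usd_per_kwh = 50, 4, 0.17  # 4 h/day gaming
kwh_per_month = savings_w / 1000 * hours_per_day * 30
print(f"{kwh_per_month:.1f} kWh/month -> ${kwh_per_month * rate_usd_per_kwh:.2f}")
# 6.0 kWh/month -> $1.02
```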
 
To be honest though, I don't see people buying the 7800X3D and then deciding to play at 1080p. I expect the people who'll buy it to want to play at 1440p or 4K, where it'll probably be GPU-bound.
Testing at 1080p and below is done to see how much CPU performance headroom is available, not because people who buy 4090s will actually play at a resolution like that.
 
Testing at 1080p and below is done to see how much CPU performance headroom is available, not because people who buy 4090s will actually play at a resolution like that.
Precisely why I think arguing about which CPU is better for gaming is pointless, in my honest opinion. In real-world usage it'll be difficult to notice the differences between the top-performing gaming CPUs with a 4090, and the differences will be even less noticeable with the GPUs people are more likely to use (3080s and 6800/6900 XTs).
 
There's no reason to expect that the 7800X3D will do substantially better.
Given the price, why would it?

As for benchmarks, I remember the disabled 7950X3D being at least 21% faster in one of the games in the TPU review, so I don't doubt "up to 24%" for a second.
 