
Can someone explain to me why a 13600K gets more FPS in games than my Ryzen 5600, even though games are not maxing out usage on the 5600?

Zen 4 7900X3D slower than a 9900K at 1080p? Yeah, that's not the CPU, that's the game engine failing to keep up.
Yikes! That's jank! (unless 10 cores/20 threads are better)
 
Yikes! That's jank! (unless 10 cores/20 threads are better)
It's two six-core CCDs with inter-CCD latency.

A single-CCD X3D is the only Zen 4 X3D worth getting.
 
It's two six-core CCDs with inter-CCD latency.

A single-CCD X3D is the only Zen 4 X3D worth getting.
Single-CCD? IIRC, that's why the Ryzen 7 5800X is so hard to get good thermal contact on, and why it easily shoots up to 85°C!
 
It's two six-core CCDs with inter-CCD latency.

A single-CCD X3D is the only Zen 4 X3D worth getting.
It has that, and it also has the 5800X3D slower than the 9900K in those benchmarks.
 
Single-CCD? IIRC, that's why the Ryzen 7 5800X is so hard to get good thermal contact on, and why it easily shoots up to 85°C!
No, it's because it's 100+W of juice pushed through a tiny area. X3D chips don't have this issue because they're running much deeper into their efficiency curve due to cache voltage limits.

It's also why big monolithic Intel chips are easier to cool than the theoretically better Ryzen, which runs hotter despite lower wattage.
 
No, it's because it's 100+W of juice pushed through a tiny area. X3D chips don't have this issue because they're running much deeper into their efficiency curve due to cache voltage limits.

It's also why big monolithic Intel chips are easier to cool than the theoretically better Ryzen, which runs hotter despite lower wattage.
Mussels said:
It's simple to imagine that a 5800X, with the wattage of both CCXs shoved into just one, is going to be harder to cool and fussier about the heatsink contact on top.
 
It was a motherboard and RAM issue, and it's just too expensive to go next gen right now. I also didn't realize how much DDR4 was costing me (20-30 FPS) the last time I built with that Raptor Lake, which also pissed me off. Once I realized all the reviews were done with DDR5 RAM, and how much FPS I was losing in game, it annoyed me a bit, because I was always under the impression RAM didn't matter much. My mistake; lesson learned again.

Yeah, before Alder Lake was released, Intel claimed the DDR5 uplift could be as much as 20%.

I'd personally aim for a CPU you can OC, though. Those X3D chips don't OC, and some people have bricked the CPU quickly trying.
 
What you were missing is the idea that AMD is half-baked silicon with terrible support from devs; that's why Intel and Nvidia always work best, while AMD always has its ups and downs.

Uh oh, not again with the half-baked doughnut leftovers from the local bakery.
 
You can OC DDR4 on Raptor Lake to get performance almost as good as DDR5. A lot of the initial DDR4 vs DDR5 reviews (HWUB) were done in Gear 2 instead of Gear 1.
 
You can OC DDR4 on Raptor Lake to get performance almost as good as DDR5. A lot of the initial DDR4 vs DDR5 reviews (HWUB) were done in Gear 2 instead of Gear 1.
There is no Gear 1 for DDR5. I suppose you mean Gear 1 DDR4 vs DDR5.

But the early-release situation isn't the reality of today. DDR5 memory support, clocks and timings are much better now.

I was surprised to see Intel carry DDR4 on to 12th and 13th gen. It gives them an edge over AMD, which is strictly DDR5 on the 7000 series.

I jumped to DDR5 at release but went with 16GB and Samsung chips, so it's basically a mix of DDR4 with DDR5 bandwidth. Pretty sure they are B-dies too.
 
What you were missing is the idea that AMD is half-baked silicon with terrible support from devs; that's why Intel and Nvidia always work best, while AMD always has its ups and downs.

Well, my 5600 and 6800 XT system has been rock-solid stable since I built it... games run smooth as butter, no issues at all on my end. So I'm not sure what you're on about.
 
Well, my 5600 and 6800 XT system has been rock-solid stable since I built it... games run smooth as butter, no issues at all on my end. So I'm not sure what you're on about.
If your games run smooth as butter, I don't think you need to worry about upgrading any sooner. Save your budget for a better next-gen GPU and CPU (and let's hope pricing gets more reasonable in the future).

IMHO, I prefer smooth gameplay over very high FPS that often spikes and hiccups.
 
Going 7800X3D as recommended by posters above is genuinely a bright idea. ....

6000 MT/s RAM is a little bit insufficient tbh, I'd go with something closer to 7000 MT/s...
Do the new 7xxx X3D chips have a faster IMC than the non-X3D chips? If not, then 6000 MT/s is a great choice, as faster RAM may cause instability.
 
So should new CPUs like the Ryzen 7800X3D be tested with a midrange GPU like the 6800 XT instead of a 4090 (not exclusively, of course) to give those of us considering a CPU upgrade a realistic view of how our FPS will increase?

Fuck, I worded that badly. My head hurts, sorry. I think you get what I'm asking, though.
Hardware Unboxed does CPU and GPU scaling videos each generation; just search for that.
Here's one example:
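
To put the CPU-vs-GPU bottleneck idea in rough back-of-envelope terms (a minimal sketch with made-up placeholder numbers, not figures from any review): the FPS you actually see is roughly the minimum of what the CPU and the GPU can each deliver at your settings, so a CPU upgrade only shows its full gain once the GPU stops being the limit.

```python
# Minimal sketch of the CPU/GPU bottleneck idea: the frame rate you see is
# roughly capped by whichever of the two is slower at your settings.
# All numbers below are made-up placeholders, not benchmark results.

def effective_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """Frame rate is bounded by the slower of the CPU and GPU limits."""
    return min(cpu_limit_fps, gpu_limit_fps)

gpu_limit = 110.0  # hypothetical: what a midrange GPU manages at your settings

for label, cpu_limit in [("current CPU", 95.0), ("upgraded CPU", 160.0)]:
    fps = effective_fps(cpu_limit, gpu_limit)
    print(f"{label}: CPU limit {cpu_limit:.0f}, GPU limit {gpu_limit:.0f} "
          f"-> ~{fps:.0f} fps on screen")

# With these placeholder numbers the upgrade only moves on-screen FPS from
# ~95 to ~110; the rest of the CPU's headroom is hidden behind the GPU.
```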
 
X3D chips don't have this issue because they're running much deeper into their efficiency curve due to cache voltage limits.
And they're simply lower-clocked chips.

It's also why big monolithic Intel chips are easier to cool than the theoretically better Ryzen, which runs hotter despite lower wattage.
Intel's chips are anything but easy to cool. Also, it's not total wattage that matters here; Ryzen chips have higher power density per unit of die area.
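
To put rough numbers on that density point (the die areas and power figures below are approximate ballpark values, used purely for illustration):

```python
# Rough power-density comparison in W per mm^2 of die area.
# Die sizes and power figures are approximate ballpark values for illustration.

parts = {
    # name: (approx. sustained package power in W, approx. die area in mm^2)
    "Ryzen 7 5800X, single 8-core CCD": (100, 81),   # heat concentrated in one small chiplet
    "Monolithic Intel desktop die":     (200, 215),  # bigger die spreads the heat out
}

for name, (watts, area_mm2) in parts.items():
    print(f"{name}: ~{watts} W over ~{area_mm2} mm^2 -> ~{watts / area_mm2:.2f} W/mm^2")

# Even with lower total wattage, forcing the heat through a much smaller area
# gives the chiplet part the higher hotspot density, which is why it is
# fussier about mounting pressure and cold-plate contact.
```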
 
May I ask @Space Lynx if you play your games in fullscreen mode, or do you play in some kind of borderless/windowed mode? It seems like you have a decent monitor (NZXT Canvas 27Q) on your hands. Do you use DisplayPort in combination with adaptive sync technology when playing solo/multiplayer games?
 
May I ask @Space Lynx if you play your games in fullscreen mode, or do you play in some kind of borderless/windowed mode? It seems like you have a decent monitor (NZXT Canvas 27Q) on your hands. Do you use DisplayPort in combination with adaptive sync technology when playing solo/multiplayer games?

I am currently playing FFXIV in borderless windowed. Yes, it is a very good monitor; I have been happy with it.

Yes, I use a DP cable. Yes, I have FreeSync Premium turned on in the AMD drivers, no VSync, and I cap frames to 160 with RivaTuner. Every game has worked great for me, perfectly smooth.
 
I am currently playing FFXIV in borderless windowed. Yes, it is a very good monitor; I have been happy with it.

Yes, I use a DP cable. Yes, I have FreeSync Premium turned on in the AMD drivers, no VSync, and I cap frames to 160 with RivaTuner. Every game has worked great for me, perfectly smooth.
If everything is great then just hold off another generation.
 
What kills me about all this is that you had the chance to buy the 13600KF back, and you passed. I even gave you time to decide!!!

Don't beat yourself up over it.

X3D chips don't overclock. Great performance, but there's no extra headroom to chase.

Got the cash? 13700K. 5.3 GHz all-core out of the box on any decent air cooler.

We can talk about e-cores some other time. For half the benchmarking I do, they are disabled.
Isn't that the same 13600KF I have now? One more case mod and the guts go into it. Both my 12600K and 12700K rigs are running at 5.3 GHz now, but they're on DDR4. I have 64GB (2x32GB) of G.Skill DDR5-6400 Trident Z5 with Hynix A-die for the 13600KF.
 
Motherboard quality, RAM, crap running in the background, chipset drivers, BIOS settings, etc... so many things can make a difference even if it seems like a straightforward upgrade. Why just assume it was the CPU?
 
Motherboard quality, RAM, crap running in the background, chipset drivers, BIOS settings, etc... so many things can make a difference even if it seems like a straightforward upgrade. Why just assume it was the CPU?

Because they are completely different classes of CPU: you will actually get much lower FPS with a 5600... it also costs a third as much, so that's expected. The other things do matter as well, for sure, but the biggest, most glaring difference would be the CPU.

The dips are also much less pronounced on the newer chips, so it actually does feel quite different in games. I game at 4K and I could instantly tell the difference in Cyberpunk between a 12600K OC'd at 5.3 and a 13700KF at 5.6: the FPS counter was only about 10% higher, but it felt much smoother. Not all games, many of them are exactly the same, but stuff like Hogwarts Legacy, Spider-Man, etc...

There is no Gear 1 for DDR5. I suppose you mean Gear 1 DDR4 vs DDR5.

But the early-release situation isn't the reality of today. DDR5 memory support, clocks and timings are much better now.

I was surprised to see Intel carry DDR4 on to 12th and 13th gen. It gives them an edge over AMD, which is strictly DDR5 on the 7000 series.

I jumped to DDR5 at release but went with 16GB and Samsung chips, so it's basically a mix of DDR4 with DDR5 bandwidth. Pretty sure they are B-dies too.

They compared Gear 2 DDR5 vs Gear 2 DDR4. A low-latency quad-rank (4-stick) B-die kit at 4000+ with tuned subtimings will beat or match a DDR5-6000 CL36 kit.

Source: me. I went from a 12600K in Gear 1 on a DDR4 full-ATX board to an ITX system using DDR5 (I needed 64GB and DDR5 has the density), and I actually lost a bit of FPS. FPS Chasers also did a comparison with a 13600K and came to the same conclusion. Basically, very fast DDR4 = mid-range, non-overclocked DDR5.
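
A quick back-of-envelope way to see why tuned DDR4 keeps up (a minimal sketch; the kits below are just the speeds and CAS values mentioned in this thread plus a typical faster DDR5 bin):

```python
# Back-of-envelope "first word" latency: CAS cycles converted to nanoseconds.
# One memory clock carries two transfers, so latency_ns = CL * 2000 / MT/s.

kits = {
    "DDR4-4000 CL16 (tuned B-die)": (4000, 16),
    "DDR5-6000 CL36":               (6000, 36),
    "DDR5-7200 CL34":               (7200, 34),
}

for name, (mts, cl) in kits.items():
    latency_ns = cl * 2000 / mts
    bandwidth_gbs = mts * 8 / 1000  # GB/s across a 64-bit channel
    print(f"{name}: ~{latency_ns:.1f} ns CAS latency, ~{bandwidth_gbs:.0f} GB/s per channel")

# DDR5 wins easily on raw bandwidth, but a tightly tuned DDR4 kit can still
# have the lower absolute latency, which is what many games care about most.
```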
 
I game at 4K and I could instantly tell the difference in Cyberpunk between a 12600K OC'd at 5.3 and a 13700KF at 5.6: the FPS counter was only about 10% higher, but it felt much smoother.
Probably a lot of placebo there, especially at 4K.
 
Probably a lot of placebo there, especially at 4K.
The 12600K was maxed out and capping in the city. The FPS counter was in the high 80s, but the game felt like 45-50 fps, with very jerky, horrible frame pacing.

4K with DLSS and RT on. Definitely not placebo... I'm too OCD to write stuff like that off; I redid the testing with different BIOS settings, etc.
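
If anyone wants to put numbers on "felt like 45-50 fps" instead of eyeballing the counter, here's a minimal sketch. It assumes a plain text log with one frametime in milliseconds per line (the file name is hypothetical; export from whatever capture tool you use), and uses the average-of-the-slowest-1%-of-frames convention for the 1% low:

```python
# Minimal sketch: turn a frametime log (one value in ms per line) into an
# average FPS and a 1% low, which track smoothness far better than the
# on-screen FPS counter. "frametimes_ms.txt" is a hypothetical file name.

def summarize(frametimes_ms: list[float]) -> tuple[float, float]:
    ordered = sorted(frametimes_ms)                   # slowest frames at the end
    avg_fps = 1000.0 / (sum(ordered) / len(ordered))
    worst = ordered[int(len(ordered) * 0.99):]        # slowest 1% of frames
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

if __name__ == "__main__":
    with open("frametimes_ms.txt") as f:
        frametimes = [float(line) for line in f if line.strip()]
    avg_fps, low_fps = summarize(frametimes)
    print(f"Average: {avg_fps:.1f} fps, 1% low: {low_fps:.1f} fps")

# A run can average high-80s fps while the 1% lows sit in the 40s, which is
# exactly the "counter looks fine but it feels jerky" situation above.
```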
 