
First Game Test With the Ryzen 7 5800X3D Appears as Promised

Intel is running with DDR5-4800 C40. Come on, people run it at 7000+ C30 or C32 already. And AMD is running with 3200 C14, while my AMD system runs at 3800 C13.
As true as that might be, the default DDR5 speeds of 5200 and 5600 MHz sadly aren't much faster than a good DDR4 kit, and that's what most pre-builds with DDR5 come with, at least in my country.

3200MHz CL16 = 10ns
3600MHz CL18 = 10ns
5200MHz CL38 = 14.615ns

So at the default speeds and timings people get on the cheap stuff, the higher-clocked DDR5 is sadly still behind the faster DDR4 in latency, if the calculation is right.

Link: https://notkyon.moe/ram-latency2.htm
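
If anyone wants to sanity-check those numbers: first-word latency is just the CAS latency in cycles divided by the real memory clock, which is half the quoted transfer rate (DDR = double data rate). Here's a minimal Python sketch of that arithmetic; the kit names are just the examples from above, and I'm assuming the linked calculator uses the same formula.

Python:
# First-word latency in ns = CL cycles / real memory clock (MHz) * 1000.
# The real clock is half the quoted MT/s figure, since DDR transfers twice per clock.
def cas_latency_ns(data_rate_mts, cas_latency):
    real_clock_mhz = data_rate_mts / 2
    return cas_latency / real_clock_mhz * 1000

for kit, (rate, cl) in {
    "DDR4-3200 CL16": (3200, 16),
    "DDR4-3600 CL18": (3600, 18),
    "DDR5-5200 CL38": (5200, 38),
}.items():
    print(f"{kit}: {cas_latency_ns(rate, cl):.3f} ns")

# Output:
# DDR4-3200 CL16: 10.000 ns
# DDR4-3600 CL18: 10.000 ns
# DDR5-5200 CL38: 14.615 ns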

 
DDR5 needs 5200 CL30.
 
Brothers, there is something I don't understand about the leaked 5800X3D test. According to the leaker, he tested Shadow of the Tomb Raider at 720p, low settings, and there is a difference between the 12900K, which with DDR5 gives 190 fps with a 3090 Ti, and the AMD chip, which gives 231 fps with a 3080 Ti. However, how is it explained that my 12600K, with a 3070 and DDR4 3600 MHz on a B660 board, can achieve 201 fps in that test? Here is a capture of my test.
 

Attachments

  • SOTR 720p 12600k +3070 Lara.jpg
So? 1. 8 cores is plenty for gaming
lol, I've seen this baaaaack in the good ole days of the 3570K vs FX 8350 daaays

How the tables have turned
 
Brothers, there is something I don't understand about the leaked 5800X3D test. According to the leaker, he tested Shadow of the Tomb Raider at 720p, low settings, and there is a difference between the 12900K, which with DDR5 gives 190 fps with a 3090 Ti, and the AMD chip, which gives 231 fps with a 3080 Ti. However, how is it explained that my 12600K, with a 3070 and DDR4 3600 MHz on a B660 board, can achieve 201 fps in that test? Here is a capture of my test.

You need to take into account that most "normal" DDR5 buyers who don't know about timings and latency buy a generic set of memory, and it has been shown before that low-speed, loose-timing memory doesn't benefit Intel's 12th-gen lineup. Overall prices also make DDR4 at 3200 CL16 or 3600 CL18 a much better buy than the cheapest DDR5.

Intel also said they would strongly advise against using DDR4 on their 13th-gen CPUs even though they support it. What's up with that I'm not sure; maybe they just want to steal your kidney, I don't know, but that's Intel's policy in a nutshell.

I made this as a joke, but if AMD really hits it off with their V-Cache, it should be good for transcoding and other work-related stuff thanks to the extra cache.
 
Going to wait for the real benchmarks... I still want it tho.
And yes, I know it's probably not worth the upgrade over the normal 5800X...
 
It's just like pre-orders... wait for the real benchmarks to come out. I suspect it won't gain much in some games but will be much better in others. Either way, it's a very innovative product and a very interesting solution to the node shrink issues.
 
As true as that might be, the default DDR5 speeds of 5200 and 5600 MHz sadly aren't much faster than a good DDR4 kit, and that's what most pre-builds with DDR5 come with, at least in my country.

3200MHz CL16 = 10ns
3600MHz CL18 = 10ns
5200MHz CL38 = 14.615ns

So at the default speeds and timings people get on the cheap stuff, the higher-clocked DDR5 is sadly still behind the faster DDR4 in latency, if the calculation is right.

Link: https://notkyon.moe/ram-latency2.htm

While those calculations are absolutely true, they don't quite translate directly to reality in terms of real-world latencies due to the architectural/signalling differences between DDR4 and DDR5. DDR5 is quite a dramatic change compared to previous DDR generations, making comparisons more difficult than previously. (Surprisingly, LTT did a pretty decent explainer on this.) Most Alder Lake reviews showed DDR5 performing slightly better than DDR4 even in latency-sensitive workloads (and in non-bandwidth-sensitive workloads) despite the spec sheet seeming to indicate that it would be quite a bit worse.
 
While those calculations are absolutely true, they don't quite translate directly to reality in terms of real-world latencies due to the architectural/signalling differences between DDR4 and DDR5. DDR5 is quite a dramatic change compared to previous DDR generations, making comparisons more difficult than previously. (Surprisingly, LTT did a pretty decent explainer on this.) Most Alder Lake reviews showed DDR5 performing slightly better than DDR4 even in latency-sensitive workloads (and in non-bandwidth-sensitive workloads) despite the spec sheet seeming to indicate that it would be quite a bit worse.
Still, other reviewers like Gamers Nexus show that DDR4 can still give DDR5 on the Alder Lake platform a run for its money.
 
Still, other reviewers like Gamers Nexus show that DDR4 can still give DDR5 on the Alder Lake platform a run for its money.
It absolutely can, especially with higher clocked RAM - but that's mostly due to DDR4 being extremely mature and DDR5 being the opposite. Current DDR5 clocks are very low even compared to the current range of JEDEC speed specs, let alone the planned JEDEC range expansion (up to 8400 IIRC). Once DDR5 matures a bit it'll leave DDR4 behind properly. I mean, we're barely seeing better-than-JEDEC timings on the market at all.
 
Going to wait for the real benchmarks... I still want it tho.
And yes, I know it's probably not worth the upgrade over the normal 5800X...
If you are gaming at 1440p and above, it most likely won't be worth it.
 
It's about the price. Also, I'm pretty sure this is gonna be a limited run just like the 3300X was, which will drive the price even higher.
Who knows? Until it's actually in stores you're all guessing, but for me it's a no-brainer, cheaper in every way. Have you seen the prices in Aus?
lol, I've seen this baaaaack in the good ole days of the 3570K vs FX 8350 daaays

How the tables have turned

Name a game that uses all 16 Threads

The 8350 (well, the 8-core CPU) ended up being the better choice, it seems.
 
And how do you arrive at that conclusion?
The idea is flawed to begin with, as that is not how CPU power allocation works when game studios develop games. It is also not how scaling works; you can run into faux constraints that developers simply didn't give a f*** about, and all sorts of flukes.

As for "but what about in practice", we had Ryzen vs Intel, where Intel was showing "edge" at low resolutions, which was supposed to hint at huge advantage in the upcoming games, but when the said games arrived, its advantage evaporated.


High-FPS, low-resolution testing makes sense when applied to ACTUAL USE CASES such as competitive shooters, but then, it's never 720p to begin with.

Oh, and it is also not Tomb Raider.
 
As for "but what about in practice", we had Ryzen vs Intel, where Intel was showing "edge" at low resolutions, which was supposed to hint at huge advantage in the upcoming games, but when the said games arrived, its advantage evaporated.
I wonder if there are any tests that focus on this: run 2018 games with CPUs and GPUs from the same year, then use a new GPU with the old CPUs and run 2021 games.
 
Brothers, there is something I don't understand about the leaked 5800X3D test. According to the leaker, he tested Shadow of the Tomb Raider at 720p, low settings, and there is a difference between the 12900K, which with DDR5 gives 190 fps with a 3090 Ti, and the AMD chip, which gives 231 fps with a 3080 Ti. However, how is it explained that my 12600K, with a 3070 and DDR4 3600 MHz on a B660 board, can achieve 201 fps in that test? Here is a capture of my test.
Just wait for the real benchmarks on TPU or Guru3D.
 
Name a game that uses all 16 Threads
Yeah, well, that's the kind of response the Intel die-hards always gave, right up until Coffee Lake released.

The 8350 (well, the 8-core CPU) ended up being the better choice, it seems.

lol, can't believe I just saw "Better Choice" and "8350" in the same sentence :D
 
Who knows? Until it's actually in stores you're all guessing, but for me it's a no-brainer, cheaper in every way. Have you seen the prices in Aus?


Name a game that uses all 16 Threads

The 8350 (well, the 8-core CPU) ended up being the better choice, it seems.
Ashes of the Singularity
 