
AMD Ryzen Memory Analysis: 20 Apps & 17 Games, up to 4K

W1zzard

We take a close look at memory performance on AMD Ryzen, using G.SKILL's Flare X modules, which are optimized for the new platform. Our testing includes memory frequencies ranging from 2133 MHz all the way to 3200 MHz, with timings from CL14 to CL18. All games are tested at their highest settings, at the realistic resolutions used by gamers today: 1080p, 1440p, and 4K.

 
Something I noticed in a couple of earlier articles - could you sort the graphs so that the better results are at the top?
I know that every chart says "Higher/Lower is better", but when both kinds are on the same page, all sorted from lower to higher, it sometimes gets difficult to read.
 
Good review with lots of benchmarks. But for the gaming benchmarks you only measured average fps. Average fps doesn't tell the whole story;
you should measure minimum fps, or better yet the 1% and 0.1% lows. I'm sure that memory frequency will have a big effect on those
measurements, which relate more to gameplay smoothness than average fps does.
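
For anyone unfamiliar with the metric, here's a minimal sketch of one common way to derive 1% and 0.1% lows from a frame-time log (my own illustration, not the review's methodology, and there are other definitions floating around): sort the frames by frame time, take the slowest 1% (or 0.1%), and report the average fps over just those frames. The function name and the example frame times are made up.

[CODE]
# Minimal sketch (my own illustration, not the review's methodology) of
# computing "1% low" / "0.1% low" fps from a frame-time log in milliseconds:
# average the slowest fraction of frames and convert that back to fps.
import numpy as np

def percentile_low_fps(frame_times_ms, fraction):
    """Average fps over the slowest `fraction` of frames (0.01 = 1% low)."""
    times = np.sort(np.asarray(frame_times_ms, dtype=float))[::-1]  # slowest first
    n = max(1, int(len(times) * fraction))
    return 1000.0 / times[:n].mean()  # ms -> fps over the worst n frames

# Hypothetical capture: mostly 16.7 ms frames (~60 fps), a few hitches, one big spike.
log = [16.7] * 990 + [33.3] * 9 + [100.0]
print(percentile_low_fps(log, 0.01))   # 1% low  -> ~25 fps
print(percentile_low_fps(log, 0.001))  # 0.1% low -> 10 fps
[/CODE]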
 
Well done! I like this angle of testing!!!

And this, people, is exactly why you don't test at low settings and resolutions (read: below 1080p): it exaggerates the results found, and those do not extrapolate up.

I'd like to see the same testing on Intel and see how the story differs there...
 
There are 5 games that greatly benefit by 13-17% when going from 2133 to 3200MHz RAM (Hitman, FC Primal, Civ6, Fallout4, Warhammer) and most of the others gain very little with Dishonored2 gaining 9%. It depends on the game engine I suppose. So, gaming performance of Ryzen clearly depends on RAM speed, along with game engine optimisations.
 
Yeah, interesting read, thank you. So it seems confirmed that at UHD resolution memory speed does not matter; the system is bottlenecked by the GPU.

Btw, the MySQL bench should read "higher is better" - TPS means Transactions Per Second, so more transactions means better performance.
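
To spell out why higher is better, here's a toy sketch (not the benchmark the review actually used, and the transaction body is just a placeholder): TPS is simply transactions completed divided by elapsed wall-clock time.

[CODE]
# Toy sketch of a TPS measurement (not the review's benchmark): run as many
# transactions as possible for a fixed duration and divide by elapsed time,
# which is why a higher TPS number means better performance.
import time

def run_transaction():
    # Placeholder for one database transaction (a real benchmark would
    # issue queries against MySQL and commit).
    pass

def measure_tps(duration_s=10.0):
    completed = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        run_transaction()
        completed += 1
    return completed / (time.perf_counter() - start)  # transactions per second

print(f"TPS: {measure_tps(1.0):.0f}")
[/CODE]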
 
Awesome test!

The new AGESA updates (latency decrease) will spoil the epic work a bit...
 
Unfortunately, you have to keep going up in frequency to see the gains. 3,600 looks nice from some vids.

If 4,000 is achievable, then you're gonna see the dumb fabric working.
 
How about doing 1% and 0.1% percentiles for gaming? Average fps does not tell the whole story, especially with higher RAM frequencies.
 
@W1zzard @EarthDog

"The story repeats in our game-tests, where the most difference can be noted in the lowest resolution (1920 x 1080), all of 5.5 percent"

Again, as I've said before, it would be helpful if a low-res test could be added, e.g. 1024x768 or even less, so we can know the true fps performance of the processor. Testing only at 1080p and up, it's being hidden by GPU limiting, which can kick in and out as different scenes are rendered, so you don't really know how fast it is.

Contrary to popular opinion this really does matter. People don't change their CPUs as often as their graphics cards, so in the not too distant future we're gonna see 120Hz 4K monitors along with graphics cards that can render at 4K at well over 120fps. The slower CPU will then start to bottleneck that GPU so that it perhaps can't render a solid 120fps+ in the more demanding games, but the user didn't know about this before purchase. If they had, they might have gone with another model or another brand that does deliver the required performance, but are now stuck with the slower CPU because the review didn't test it properly. So again, yeah it matters. Let's finally test this properly.

Good review otherwise and good to know that it's not worth spending loads on fast, expensive memory. I remember it being a similar situation with Sandy Bridge when I bought my 2700K all those years ago. Saved me a ton of money.
 
It really doesn't matter. I can't agree at all. Sorry. I don't understand what testing at lower res with lower settings shows, considering people don't play at that res with low settings and higher-end cards. So, it's a dataset, sure, but I can't wrap my head around its relevance since people barely use it. Again, it exaggerates results which do not extrapolate to higher res/settings. It doesn't matter and is tested properly IMO.
 
I just explained in detail why it matters. Not sure what more I can add to this. :ohwell:

Again, I want to stress that these tests are in addition to the current tests, not to replace them.
 
Just finished reading the whole review, and I don't mean to be rude, but hasn't this review left out the most important information, the minimum frame rates? You know, the ones that are reportedly heavily affected by RAM speed and the reason people are saying Ryzen gets gimped on 2133/2400 RAM...


Sorry. I don't understand what testing at lower res with lower settings shows, considering people don't play at that res with low settings and higher-end cards. So, it's a dataset, sure, but I can't wrap my head around its relevance
He literally explained why it's relevant in the post you quoted...
 
I just explained in detail why it matters. Not sure what more I can add to this. :ohwell:

Again, I want to stress that these tests are in addition to the current tests, not to replace them.
I understand what you are saying. I 100% disagree with your assertion (that it's relevant)... it's just that simple.

What you said doesn't really matter for people (to me - it shouldn't for the rest, lol). It shows nothing that extrapolates to a resolution and settings where people actually play. By testing in such an artificial environment, you have created an UNREALISTIC environment to capture what amounts to an IRRELEVANT data set. The faster CPU down low, at your lower-than-low settings and 1080p, will still be the faster chip up top at 4K.

I believe it's a waste of time to even add them to the review. Now, the MINIMUM FPS is a good thing to have here.... :)
 
Good luck with 120 fps at 4K. Between lazier coding and cramming in more textures/effects, it's not happening anytime soon.
 
Contrary to popular opinion this really does matter. People don't change their CPUs as often as their graphics cards, so in the not too distant future we're gonna see 120Hz 4K monitors along with graphics cards that can render at 4K at well over 120fps. The slower CPU will then start to bottleneck that GPU so that it perhaps can't render a solid 120fps+ in the more demanding games, but the user didn't know about this before purchase. If they had, they might have gone with another model or another brand that does deliver the required performance, but are now stuck with the slower CPU because the review didn't test it properly. So again, yeah it matters. Let's finally test this properly.

Not really true. As GPUs improve, so do the demands on them. That isn't as true with CPUs. The demand on the CPU, with the exception of a few games like Cities Skylines, pretty much stays the same. This is why a gaming rig with a 1080 Ti and a 4.4GHz 2500K is still viable.

At the end of the day, as long as we are getting to the point where we are removing the GPU bottleneck, which is what the 1080p tests with a GTX1080 largely do, there is no point in going lower.
 
I know it's not that big of a deal, but I can note that the FPS gains from memory speed show up most in the games where the gap between Ryzen and the 7700K is biggest. Some of them are Fallout 4, Hitman and Total War Warhammer. No wonder these games usually give weird and inconsistent GPU results; these games are optimised like fried potatoes.

At the end of the day, as long as we are getting to the point where we are removing the GPU bottleneck, which is what the 1080p tests with a GTX1080 largely do, there is no point in going lower.

I don't think that's accurate. The 2500K falls behind the i3s in a lot of tests, and (ignoring core clocks) it is only about as strong as the current Pentiums. It has to be overclocked to keep up with a 1080 Ti.
 
Oh just saw this:

It's important to point out here, that at 1080p, games become more CPU-limited, and faster memory is somewhat rewarding (again, 5.5 percent). At 4K Ultra HD, the game is more GPU-limited, and hence the differences aren't are pronounced.

I think that is supposed to be "aren't as pronounced".
 
I understand what you are saying. I 100% disagree with your assertion (that it's relevant)... it's just that simple.
What he's asking for is a low-res test that is CPU-limited, because the CPU that gets the worse result will be the CPU that starts to bottleneck future GPUs first. It is a relevant test for people who plan to keep their CPU longer than their GPU (almost everyone).
 
Yep. Again, I get it. That is a completely different test than what is going on here, though. W1z isn't testing the CPU, he's testing the changes in memory speed/timings in games/apps. But again, in games, the faster CPU at 800x600 is still going to be the fastest CPU at 4K, right (Right.)? Now, if one was testing what Qubit is saying, you would want to get a roundup of CPUs and test them, not the same CPU with different memory speeds as is done here. This is proper testing: isolating memory speed from everything else using a REALISTIC testing environment to yield REALISTIC results, instead of contrived results from an UNrealistic testing environment.
 
I notice stuttering in GTA5 a lot.... even though fps rarely drops below 30 and mostly hovers above 50.... my system is a bit dated though.

Min FPS scores would be nice, though stuttering doesn't seem to get measured in the FPS through Steam's FPS counter. Dunno how they measure it scientifically.
 
I know it's not that big of a deal, but I can note that the FPS gains from memory speed show up most in the games where the gap between Ryzen and the 7700K is biggest. Some of them are Fallout 4, Hitman and Total War Warhammer. No wonder these games usually give weird and inconsistent GPU results; these games are optimised like fried potatoes.



I don't think that's accurate. The 2500K falls behind the i3s in a lot of tests, and (ignoring core clocks) it is only about as strong as the current Pentiums. It has to be overclocked to keep up with a 1080 Ti.

Who has a 2500k that's not OCed to 4.5+? That's like buying an i3...stupid lol
 
But again, in games, the faster CPU at 800x600 is still going to be the fastest CPU at 4K, right (Right.)?
Yeah, that's the point: if, say (hypothetically), 3200 MHz was 15% faster at 800x600 than 2133 MHz, then in the future it will be 15% faster at higher resolutions with newer GPUs.

The conclusion of the review advises against buying faster RAM because the price increase is bigger than the performance increase shown in the GPU-limited tests, but you can bet anyone who follows that advice will be mad when their 1480ti is getting bottlenecked by their 2133 MHz RAM and DDR4 prices are higher than they were in 2017.
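
To make the extrapolation argument concrete, here's a rough toy model (entirely my own, with made-up numbers, not anything from the review): treat delivered fps as roughly min(CPU-limited fps, GPU-limited fps). Today's GPU hides the memory-speed gap at 4K; a future GPU that raises the GPU ceiling above the CPU ceiling exposes it again.

[CODE]
# Toy bottleneck model (my own assumption, made-up numbers): delivered fps is
# roughly capped by whichever of the CPU or GPU limit is lower, so a gap that
# only shows at low res today resurfaces at 4K once GPUs get fast enough.
def delivered_fps(cpu_limited_fps, gpu_limited_fps):
    return min(cpu_limited_fps, gpu_limited_fps)

cpu_fps_2133 = 100  # hypothetical CPU/memory-limited fps with 2133 MHz RAM
cpu_fps_3200 = 115  # hypothetically 15% faster with 3200 MHz RAM

# Today at 4K the GPU caps both configs at ~60 fps: no visible difference.
print(delivered_fps(cpu_fps_2133, 60), delivered_fps(cpu_fps_3200, 60))    # 60 60
# A future GPU pushing ~200 fps at 4K exposes the full 15% memory gap again.
print(delivered_fps(cpu_fps_2133, 200), delivered_fps(cpu_fps_3200, 200))  # 100 115
[/CODE]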
 