
Gaming benchmarks: DDR4 2133 MHz VS DDR4 3000 MHz (Core i7 6700K)

Repeating something I understand and don't agree with doesn't help anything. Neither does your insult... I'm not a child (stubborn, sure).

Let me try the same thing...

By taking the best result, you are posting a best-case scenario, not an average. Taking only the best result exaggerates the outcome. Accidental lag, as you say, is accounted for in an average... that's why it's an average. You could also throw out the lowest and highest results and average the rest to accomplish the same thing. As it stands, though, you are discarding all results but the fastest. You are manipulating results by taking only the highest values to get an expected outcome. I simply don't agree with that methodology. That's all.
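To make the distinction concrete, here is a minimal Python sketch of the three approaches - best-only, plain average, and the throw-out-the-extremes (trimmed) average (the FPS numbers are made up for illustration):

Code:
# Hypothetical minimum-FPS results from seven benchmark runs.
fps = [97, 101, 102, 103, 101, 102, 103]

def trimmed_mean(values):
    # Throw out the single lowest and highest result, average the rest.
    trimmed = sorted(values)[1:-1]
    return sum(trimmed) / len(trimmed)

print(max(fps))             # 103    <- best case only (what I'm objecting to)
print(sum(fps) / len(fps))  # ~101.3 <- plain average
print(trimmed_mean(fps))    # ~101.8 <- average with outliers removed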

...does repeating the same thing help? Probably not. ;)


Agreeing to disagree is where we end up... and that's OK (particularly if you don't resort to insults...)! Lead the lemmings! :)

I digress.
 
The highest results are PEAK results - the best his test system can do. I don't think that's manipulating anything. It may be a false positive, but it's not deliberate manipulation, IMO.

edit: There is a niche of consumers who actively look to buy products based on best-case results rather than average results, while marketing promotes the above-average product since it sells better across a wide range of consumers. This review shows the results that niche might want to see, versus an average-results review.
 
Sorry for my ignorance, but why are most of the games tested at 1920x1080?

Honestly, that's something I have never understood.
 
Sorry for my ignorance, but why are most of the games tested at 1920x1080?

Honestly, that's something I have never understood.
Because 4K is not widespread enough yet?
 
Taking the highest FPS results as the final data is not "manipulating"; it's just going the "optimistic" way.

3200 RAM is the minimum I'd go for, whether i7 or Ryzen; the price difference compared to 2400 or 2666 isn't high, and anything lower than that is a no-go anyway. So I don't see why people here are still discussing it. There are two or more games here where it scales well, and it would probably be even more if you took into account all games ever released, plus games still to be released.
 
One thing that is missing is the MMO factor, and the ancient question....

"How many other characters does it take to lag someone out and make them crash."

In EQ1 days, raiding meant your system/card had to survive 50-60 people, mostly on dial-up, and keep handling whatever animated content there was - which meant spending $$$$ on anything and everything to accomplish it. That's where I see a discussion like this as somewhat valuable. Sure, it prolly has a minimal impact, but there are those with too much money to burn who would pay for that 1% just to have it.
 
Sorry for my ignorance, but why are most of the games tested at 1920x1080?

Honestly, that's something I have never understood.

In my case it has to do with limiting the GPU bottleneck. Even at 1080p the GTX980 Ti is somewhat bottlenecked in many cases. Next time I do a video card benchmark I will use a resolution appropriate to the card's performance, but when it comes to other hardware parts there is no need to exaggerate, especially since 1080p is the mainstream resolution worldwide.

Taking the highest FPS results as the final data is not "manipulating"; it's just going the "optimistic" way.

That is correct. The highest recorded FPS means the system component can reach that level of performance. Keep in mind that I am searching for the best minimum FPS result, then recording the average and maximum FPS from that same run, rather than searching other runs for the highest average and maximum... But I think you already expected this, guys.
 
What do the numbers look like averaged? Or take out highest and lowest and average those? I'm curious to see if that changes results any...
 
What do the numbers look like averaged? Or take out highest and lowest and average those? I'm curious to see if that changes results any...

I will give an example with Metro Last Light Redux to show what I mean:

Run 1: 97/103/107 (min/avg/max FPS)
Run 2: 101/106/108
Run 3: 102/107/111
Run 4: 103/108/112
Run 5: 101/107/113
Run 6: 102/109/112
Run 7: 103/107/111

So out of these runs I select run 4, since the hardware scored its best minimum FPS of 103 there. In some other runs I see higher average and maximum FPS, but I stick to the one line containing the best minimum, and its average and maximum follow along with it... Usually the very first runs in most games are not that good; perhaps the needed data is still being loaded into the CPU cache, so only the following runs are accelerated by fast reads from the cache (this is just a theory).
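In code terms, my selection rule looks something like this minimal Python sketch (using the Metro numbers above; Python compares tuples element by element, so the run with the best minimum wins, with ties broken by the average):

Code:
# Each run is (min, avg, max) FPS, as in the Metro Last Light Redux list above.
runs = [
    (97, 103, 107),
    (101, 106, 108),
    (102, 107, 111),
    (103, 108, 112),
    (101, 107, 113),
    (102, 109, 112),
    (103, 107, 111),
]

# Keep the whole run whose minimum FPS is highest; runs 4 and 7 tie on the
# minimum (103), and tuple comparison breaks the tie on the average (108 > 107).
best_run = max(runs)
print(best_run)  # (103, 108, 112) -> run 4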
 
Run 1: 97/103/107 (min/avg/max FPS)
Run 2: 101/106/108
Run 3: 102/107/111
Run 4: 103/108/112
Run 5: 101/107/113
Run 6: 102/109/112
Run 7: 103/107/111

If you average them, as most reviewers would, you will find run 3 is closest to reality, not run 4. I would not pick and choose results randomly, and I think this is what @EarthDog was getting at as well.
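To check that, a quick Python sketch (same run list as above) that computes the per-column mean and finds the single run nearest to it:

Code:
runs = [
    (97, 103, 107), (101, 106, 108), (102, 107, 111), (103, 108, 112),
    (101, 107, 113), (102, 109, 112), (103, 107, 111),
]

# Per-column (min, avg, max) means across all seven runs.
means = [sum(col) / len(runs) for col in zip(*runs)]
print([round(m, 1) for m in means])  # [101.3, 106.7, 110.6]

# The run with the smallest squared distance to the mean is run 3.
nearest = min(runs, key=lambda r: sum((a - b) ** 2 for a, b in zip(r, means)))
print(nearest)  # (102, 107, 111)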
 
If you average them, as most reviewers would, you will find run 3 is closest to reality, not run 4. I would not pick and choose results randomly, and I think this is what @EarthDog was getting at as well.

Yes, I agree that calculating it that way would bring the results closer to the "standard" feeling, but averaging makes the difference between the tested components much smaller and more random than what they can really produce at their peaks. The averaging method might work well for video cards, but when measuring hardware that does not impact FPS as much as video cards do, I choose the best peak results from both sides. Granted, if I were running synthetic and productivity tests, I would measure the averages as well, but not in this case.

Here is an example: the Core i3 6320 and Core i5 2500 produced similar average FPS in games, yet in minimum FPS the Core i5 2500 was clearly stronger. If I measured averaged (summed-row) results, they would look much closer than they really are, but if I take the best minimum FPS from each (in which case the Core i3 6320 produces smaller numbers), I really show why the Core i5 2500 is the better processor.

I appreciate both of your inputs, but I will stick to my methodology for specific hardware parts like this.
 
I appreciate both of your inputs, but I will stick to my methodology for specific hardware parts like this.

No offense, but save yourself the time then. If you just want a look at close-to-peak results, run the test once to get rid of the bad mojo, then just record the second run as the best. There is no point in running it that many times if you're not going to take the average, IMHO.
 
If you average them, as most reviewers would, you will find run 3 is closest to reality, not run 4. I would not pick and choose results randomly, and I think this is what @EarthDog was getting at as well.
No offense, but save yourself the time then. If you just want a look at close-to-peak results, run the test once to get rid of the bad mojo, then just record the second run as the best. There is no point in running it that many times if you're not going to take the average, IMHO.
Exactly.

"Manipulating" really had much more of a negative connotation than I wanted to convey (sorry about that, Artas). "Choosing results randomly", I think, is more accurate. Or at least I don't agree with how the result is chosen, based on theories about HDDs and cache and... etc. Do people start games twice to "shake off the cache"? I don't think so. Besides, that is what running it multiple times already accounts for. All you are doing is taking the best result of all, which isn't an accurate representation of the results one will actually see. They play a game and that's the run they get.

I don't understand why these data sets are any different from measuring graphics cards, etc. I mean, it's an FPS value you are measuring to determine if faster memory helps. The "fastest" results don't mean a thing in this case... you'd still want to average them out to get rid of the anomalies (like the fastest result you insist on using).

...thought I digressed somewhere in here already, LOL!
 
Sorry to dig this up, but am I right in assuming that 2133 RAM with CL13 timings would not lag behind by that much? I.e., it would be faster than the 2133 CL15 RAM used in these tests.
I moved from a single stick of 8 GB DDR4 2400 CL17 to a 16 GB kit (4x4 GB) of Corsair Vengeance LPX 2133 CL13, and it was a huge difference. Using an i5 8400 and a GTX 1070, I used to get an average of about 70-75 FPS in BF1 64-player MP, with dips to 50. After the upgrade, it went to an average of 80-90, with no dips below 70.

I just wanted to say this because I often see reviewers say we should never use 2133 MHz RAM since it'll hold you back, but most don't reveal the timings of their RAM - only a "2133 vs 3200" etc. comparison. My 2400 CL17 has a true latency of about 14 ns, much like the 2133 CL15 RAM used in these tests. The 2133 CL13 has a true latency of about 12.2 ns, forming somewhat of a midpoint between the 2133 CL15 used here and the 3000 CL15 (with a true latency of about 10 ns).
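For anyone who wants to check or extend these figures: true latency in nanoseconds is CAS cycles divided by the actual clock, and since the MT/s rating counts two transfers per clock, that works out to CL x 2000 / data rate. A quick sketch:

Code:
def true_latency_ns(data_rate_mts, cas_latency):
    # CAS latency in ns; the real clock is half the MT/s data rate.
    return cas_latency * 2000 / data_rate_mts

for rate, cl in [(2400, 17), (2133, 15), (2133, 13), (3000, 15)]:
    print(f"DDR4-{rate} CL{cl}: {true_latency_ns(rate, cl):.2f} ns")

# DDR4-2400 CL17: 14.17 ns
# DDR4-2133 CL15: 14.06 ns
# DDR4-2133 CL13: 12.19 ns
# DDR4-3000 CL15: 10.00 ns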
 
It has been shown already that higher-frequency RAM increases performance in gaming, just as in content creation - whether by a little or considerably depends on the game. My problem with all of those showcases is that far too few games are tested. So right here and now I will add more.

For this specific test I have assembled a new bench computer, borrowing my own Asus GeForce GTX980 Ti Strix for it. You can see the specs of my new PC in the screenshot. Don't mind the Windows 7 "not genuine" notification - I only installed a fresh copy of Windows from an ISO for testing, and I am not activating it just for that! That being said, I obviously use a legal Windows 7 on my main Core i7 5775C PC.

ddr4-testing-pc.jpg


Intel Core i7 6700K 4500 MHz OC.
Gigabyte GA-Z170XP-SLI
Corsair Vengeance 2X8 GB DDR4 3000 MHz 15-17-17-35 (XMP1)
Corsair Vengeance 2X8 GB DDR4 2133 MHz 15-15-15-36 (SPD)
Plextor M8PG 256 GB NVMe PCIe 3.0
Asus GeForce GTX980 Ti Strix 6 GB


25 games have been tested, at 1920x1080, using the maximum available in-game presets or simply the maximum available settings, with no AA where possible. Every test scene was run many times in a row, and the best results were taken for both DDR4 frequencies using the same memory kit.
-------------------------------------------------------------------------------------------

VIDEO PRESENTATION

ALAN WAKE AMERICAN NIGHTMARE

alan-wake-american-nightmare.jpg


Obvious improvement in minimum and average FPS. Tested 5 times for each frequency.

ALIEN ISOLATION

alien-isolation.jpg


Not much of a difference. Tested 5 times for each frequency.

ARMA 3 APEX

arma-3-apex.jpg


Arma 3 is exceptional... The difference between DDR4 3000 MHz and DDR4 2133 MHz is actually bigger than the difference between a GTX980 Ti and a GTX970 - that is how CPU-bottlenecked this game is, even with a Core i7 6700K at 4500 MHz!!! I was shocked!!! Tested 5 times for each frequency.

ASHES OF SINGULARITY

ashes-of-singularity.jpg


A CPU-heavy game? It certainly does not look like it from this benchmark. Tested 2 times for each frequency.

ASSASSINS CREED SYNDICATE

assassins-creed-syndicate.jpg


Little difference. Tested 5 times for each frequency.

BATMAN ARKHAM ORIGINS

batman-arkham-origins.jpg


Some performance gain is evident. Tested 3 times for each frequency.

BATTLEFIELD 1

battlefield-1.jpg


Little difference. Tested 5 times for each frequency.

CALL OF DUTY BLACK OPS 3

call-of-duty-black-ops-3.jpg


Not much of a difference. Tested 12 times for each frequency.

COMPANY OF HEROES 2

company-of-heroes-2.jpg


The very first test is the most important and valuable here, since every run after the first drops in performance more and more. The difference is certainly evident.

CRYSIS 3

crysis-3.jpg


I have to say I have seen bigger gains on the internet; perhaps the test is not stressful enough... Tested 7 times for each frequency.

DYING LIGHT

dying-light.jpg


Little difference. Tested 5 times for each frequency.

DOOM

doom.jpg


Little difference. Tested 5 times for each frequency.

DRAGON AGE INQUISITION

dragon-age-inquisition.jpg


There is no difference whatsoever. Tested 5 times for each frequency.

FAR CRY 4

far-cry-4.jpg


Almost no difference. Tested 7 times for each frequency.

MAD MAX FURY ROAD

mad-max-fury-road.jpg


Little difference. Tested 5 times for each frequency.

METRO LAST LIGHT REDUX

metro-last-light-redux.jpg


This is huge! Unfortunately the maximum FPS does not matter that much, yet those 300+ FPS highs for 3000 MHz DDR4 were consistent in every test. Tested 12 times for each frequency. This test alone took me a whole hour!

MIDDLE EARTH SHADOW OF MORDOR

middle-earth-shadow-of-mordor.jpg


Funny how 2133 MHz DDR4 actually won here. Little difference. Tested 4 times for each frequency.

MIRRORS EDGE CATALYST

mirrors-edge-catalyst.jpg


No difference. Tested 5 times for each frequency.

PROJECT CARS

project-cars.jpg


Huge performance difference in this game. Tested 7 times for each frequency.

QUANTUM BREAK

quantum-break.jpg



Trash this game! This is the worst-optimized game I've ever seen, and it looks nowhere near as good as Crysis 3 or Battlefield 1 (speaking of objects, filtering, lighting and color tone - not the facial work, which actually looks good). A Core i7 6700K at 4.5 GHz, 16 GB DDR4 3000 MHz and a GTX980 Ti OC cannot run this game at 1080p with the highest preset and no AA? ARE YOU SERIOUS? WTH IS THIS SHIT? Anyway, you will want the highest-frequency RAM available for this game. Tested 5 times for each frequency.

RAINBOW SIX SIEGE

rainbow-six-siege.jpg


It appears that high-frequency RAM improves maximum FPS the most here. Tested 7 times for each frequency.

RISE OF TOMB RAIDER

rise-of-tomb-raider.jpg


The FPS increase from high-frequency RAM is obvious. Tested the "Mountain Peak" scene 7 times for each frequency.

THIEF

thief.jpg


A similar FPS increase is evident in Thief, just like in Tomb Raider. Tested 4 times for each frequency.

WATCH DOGS 2

watch-dogs-2.jpg


Little difference. Tested 5 times for each frequency.

WITCHER 3 WILD HUNT

witcher-3-wild-hunt.jpg


I could not record any difference, although I've seen obvious improvements from high-MHz RAM in this game in tests elsewhere. Perhaps this test scene is just not stressful enough. Tested 5 times for each frequency.
-------------------------------------------------------------------------------------------

CONCLUSIONS

1. Higher-frequency RAM does increase gaming performance - mostly by a little, but in some cases notably.

2. It is not worth selling your basic DDR4 and buying new high-frequency DDR4 at full retail price.

3. It is only worth upgrading your RAM if the extra cost is proportional to the extra performance you get in return - that being said, it is not worth selling your 16 GB of DDR4 2133 MHz for 60 EUR just to get 16 GB of DDR4 3000+ for 120 EUR and an extra 10% FPS.

If your 2133 kit will run at 2400, 2667, 2933, 3000 or 3200, go to your RAM maker's site, get the RAM timings and voltages, and try to overclock it.
 
Awesome as always. Do you plan to get, or do you already have, any Ryzen APUs, like the 2400G? I would love to see someone post benchmarks running different-speed RAM over 3200 MHz, as I haven't found any that benchmark with RAM much over that speed, sadly.
 
Check out this video:

 
From what I understand, it is not just RAM speed but CAS latency. I think TechSpot or TechRadar or Guru3D did a review on it, but CAS 14 DDR4 3200 is the sweet spot; after you go higher than that, the gains are very minimal.

I paid $352 for Dark Pro 32 GB (8x4) CAS 14-14-14-31 3200. A little pricey, but worth the extra cost to me. It will last me until DDR5 and my 2020-2021 build.
 
From what I understand, it is not just RAM speed but CAS latency.

RAM latency is more important than frequency. Speed is defined by how fast you accomplish a task, not by how many tasks you can accomplish "at all". It's like comparing a Formula 1 car to any supercar. Who gives a shit if the supercar can reach a higher top speed if it will be 40% slower around a race track??

That being said

I've tested DDR3 2400 MHz CL11-13-13 vs DDR3 1600 MHz CL7-7-7, and yes, DDR3 1600 MHz CL7 was way faster.

This DDR4 test, however, was not about speed vs. latency, since the latency on my tested 3000 MHz DDR4 is nothing special.
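For what it's worth, the same true-latency arithmetic from earlier in the thread is consistent with that DDR3 result: 1600 CL7 works out to 7 x 2000 / 1600 = 8.75 ns, while 2400 CL11 is 11 x 2000 / 2400 ≈ 9.17 ns - so the nominally "slower" kit actually responds sooner, despite the bandwidth deficit.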

I moved from a single stick of 8 GB DDR4 2400 CL17 to a 16 GB kit (4x4 GB) of Corsair Vengeance LPX 2133 CL13, and it was a huge difference. Using an i5 8400 and a GTX 1070, I used to get an average of about 70-75 FPS in BF1 64-player MP, with dips to 50. After the upgrade, it went to an average of 80-90, with no dips below 70.

This is true, although the results are staggering. I would expect a drop of 5 FPS in the minimums, not 20... (Going from a single stick to four sticks also takes you from single-channel to dual-channel operation, which alone could explain much of that jump.) Testing in multiplayer is not great either, since the scenes are not consistent...

Awesome as always. Do you plan to get, or do you already have, any Ryzen APUs, like the 2400G?

No no no. I have had two Xeon Broadwell-EP workstation builds on my shoulders in the past 3 months; I am not spending money to test some Ryzen CPUs...
 
It's not surprising to me. It only proves that modern Intel CPUs already have a wide enough memory bus.

And yes! Low latency gives you the same thing as big numbers next to "MHz". The point is that Intel and AMD designers are not willing to give us a triple- or quad-channel memory bus (for normal customers). That's an issue! It's easier to use slower, cheaper memory in triple or quad channel than to manufacture/buy expensive high-end memory - and gain what?
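To put rough numbers on that: peak theoretical bandwidth is just data rate x 8 bytes per 64-bit channel, so channel count trades off directly against module speed. A minimal sketch:

Code:
def peak_bandwidth_gbs(data_rate_mts, channels):
    # Each channel is a 64-bit (8-byte) bus; MT/s = millions of transfers/s.
    return data_rate_mts * 8 * channels / 1000

print(peak_bandwidth_gbs(3000, 2))  # 48.0   GB/s - dual-channel DDR4-3000
print(peak_bandwidth_gbs(2133, 2))  # ~34.1  GB/s - dual-channel DDR4-2133
print(peak_bandwidth_gbs(2133, 4))  # ~68.3  GB/s - quad-channel DDR4-2133

Cheap quad-channel DDR4-2133 would beat expensive dual-channel DDR4-3000 on raw bandwidth, which is exactly the point.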

For now, I'm on a Ryzen 3 2200G with Vega, and higher memory transfer rates do the job. My OCed Vega with OCed DDR4 modules is so fast that its 3D performance has reached GT 1030 level (a card with a 64-bit memory bus).

 
Excellent data :) And frankly it does not matter if you pick the best, min or avg data, as long as you do the same for both RAM kits. It's not the actual FPS that matters, but the difference... and someone said 3 FPS out of 42 doesn't matter... that is 7+%, which is huge from simply running higher-frequency RAM. If you happen to play 2 or 3 titles a lot which love high-speed memory (like PUBG), then it will be well worth it... BTW, don't listen to the guys who say latency is the only important value... games gain more from MHz as long as latency isn't horrible. Bandwidth is important when swapping huge amounts of data. CAS will always be better when faster, but frequency allows more bandwidth... a 1 ms ping doesn't help if you are on dial-up bandwidth...
 
Excellent data :) And frankly it does not matter if you pick the best, min or avg data, as long as you do the same for both RAM kits. It's not the actual FPS that matters, but the difference... and someone said 3 FPS out of 42 doesn't matter... that is 7+%, which is huge from simply running higher-frequency RAM. If you happen to play 2 or 3 titles a lot which love high-speed memory (like PUBG), then it will be well worth it... BTW, don't listen to the guys who say latency is the only important value... games gain more from MHz as long as latency isn't horrible. Bandwidth is important when swapping huge amounts of data. CAS will always be better when faster, but frequency allows more bandwidth... a 1 ms ping doesn't help if you are on dial-up bandwidth...

That only flies if you think that the difference extracted by faster RAM is actually linear. And it is not - at least far from always - and especially not if you're GPU bound, which most people are most of the time.
 
Awesome as always. Do you plan to get, or do you already have, any Ryzen APUs, like the 2400G? I would love to see someone post benchmarks running different-speed RAM over 3200 MHz, as I haven't found any that benchmark with RAM much over that speed, sadly.

Anyone thinking of getting a Ryzen APU: populate two RAM slots, so you have 4 GB + 4 GB or 8 GB + 8 GB, and preferably dual-rank (if you can find it). Dual-channel mode, in short, gives a 20-30% increase, because on an APU the RAM is also the video memory, so its bandwidth (depending on rank and channel count) is IMPORTANT.
 
That only flies if you think that the difference extracted by faster RAM is actually linear. And it is not - at least far from always - and especially not if you're GPU bound, which most people are most of the time.
Linear or not, the difference will be there whether you use the average over 5 test runs or the best/worst... that has nothing to do with RAM speed benefits being linear or not. And of course a RAM/CPU bench matters less if you are GPU-bound, though at that point it's not the CPU/RAM bench you'd be looking at anymore :)
 