
Gaming benchmarks: DDR4 2133 MHz VS DDR4 3000 MHz (Core i7 6700K)

It has been proven already that higher-frequency RAM increases performance in gaming just like in content creation, whether by a little or considerably, depending on the game. My problem with all of those showcases is that far too few games get tested. So right here and now I will add more.

For this specific test I assembled a new bench computer, with my own Asus GeForce GTX980 Ti Strix borrowed from my main rig for the test. You can see the specs of the new PC in the screenshot. Don't mind the Windows 7 "not genuine" notification - I only installed a fresh ISO copy of Windows for testing, and I am not activating it just for that! That being said, I obviously use a legal copy of Windows 7 on my main Core i7 5775C PC.

ddr4-testing-pc.jpg


Intel Core i7 6700K @ 4500 MHz OC
Gigabyte GA-Z170XP-SLI
Corsair Vengeance 2x8 GB DDR4 3000 MHz 15-17-17-35 (XMP1)
Corsair Vengeance 2x8 GB DDR4 2133 MHz 15-15-15-36 (SPD)
Plextor M8PG 256 GB NVMe PCIe 3.0
Asus GeForce GTX980 Ti Strix 6 GB


25 games have been tested, all at 1920x1080 using the maximum available in-game presets, or simply the maximum available settings, with no AA where possible. Every test scene was run many times in a row, and the best results were taken for both DDR4 frequencies, using the same memory kit.
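To make that reduction rule concrete, here is a minimal sketch of the best-of-N selection (the helper and the run numbers are illustrative only, not the actual tooling used):

```python
# Minimal sketch: keep the best of N repeated runs per memory frequency.
# FPS triples are (min, avg, max); all numbers here are illustrative.

def best_run(runs):
    """Pick the run with the highest average FPS out of N repeats."""
    return max(runs, key=lambda r: r[1])

runs_2133 = [(54, 71, 88), (52, 70, 86), (55, 72, 89)]
runs_3000 = [(48, 76, 94), (50, 77, 95), (49, 75, 93)]

print("DDR4 2133 best run:", best_run(runs_2133))
print("DDR4 3000 best run:", best_run(runs_3000))
```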
-------------------------------------------------------------------------------------------

VIDEO PRESENTATION

ALAN WAKE AMERICAN NIGHTMARE


alan-wake-american-nightmare.jpg


Obvious improvement in minimum and average FPS. Tested 5 times for each frequency.

ALIEN ISOLATION


alien-isolation.jpg


Not much of a difference. Tested 5 times for each frequency.

ARMA 3 APEX


arma-3-apex.jpg


Arma 3 is exceptional... The difference between DDR4 3000 MHz and DDR4 2133 MHz is actually bigger than the difference between a GTX980 Ti and a GTX970 - that is how CPU-bottlenecked this game is, even with a Core i7 6700K at 4500 MHz! I was shocked! Tested 5 times for each frequency.

ASHES OF THE SINGULARITY


ashes-of-singularity.jpg


A CPU-heavy game? It certainly does not look like it from this benchmark. Tested 2 times for each frequency.

ASSASSINS CREED SYNDICATE


assassins-creed-syndicate.jpg


Little difference. Tested 5 times for each frequency.

BATMAN ARKHAM ORIGINS


batman-arkham-origins.jpg


Some performance gain is evident. Tested 3 times for each frequency.

BATTLEFIELD 1


battlefield-1.jpg


Little difference. Tested 5 times for each frequency.

CALL OF DUTY BLACK OPS 3


call-of-duty-black-ops-3.jpg


Not much of a difference. Tested 12 times for each frequency.

COMPANY OF HEROES 2


company-of-heroes-2.jpg


The very first run is the most important and valuable, since performance consistently drops more and more with every run after the first. The difference is certainly evident.

CRYSIS 3


crysis-3.jpg


I have to say I have seen bigger gains reported on the internet; perhaps this test scene is not stressful enough... Tested 7 times for each frequency.

DYING LIGHT


dying-light.jpg


Little difference. Tested 5 times for each frequency.

DOOM


doom.jpg


Little difference. Tested 5 times for each frequency.

DRAGON AGE INQUISITION


dragon-age-inquisition.jpg


There is no difference whatsoever. Tested 5 times for each frequency.

FAR CRY 4


far-cry-4.jpg


Almost no difference. Tested 7 times for each frequency.

MAD MAX FURY ROAD


mad-max-fury-road.jpg


Little difference. Tested 5 times for each frequency.

METRO LAST LIGHT REDUX


metro-last-light-redux.jpg


This is huge! Unfortunately the maximum FPS does not matter that much, yet those 300+ FPS results for DDR4 3000 MHz were consistent in every run. Tested 12 times for each frequency. This test alone took me a whole hour!

MIDDLE EARTH SHADOW OF MORDOR


middle-earth-shadow-of-mordor.jpg


Funny how DDR4 2133 MHz actually won here. Little difference. Tested 4 times for each frequency.

MIRRORS EDGE CATALYST


mirrors-edge-catalyst.jpg


No difference. Tested 5 times for each frequency.

PROJECT CARS


project-cars.jpg


Huge performance difference in this game. Tested 7 times for each frequency.

QUANTUM BREAK


quantum-break.jpg



Trash this game! This is the worst-optimized game I've ever seen, and it looks nowhere near as good as Crysis 3 or Battlefield 1 (speaking about objects, filtering, lighting and color tone, not the facial work, which actually looks good). A Core i7 6700K at 4.5 GHz, 16 GB of DDR4 3000 and an overclocked GTX980 Ti cannot run this game at 1080p with the highest preset and no AA? ARE YOU SERIOUS? WTH IS THIS SHIT? Anyway, you will want the highest-frequency RAM available for this game. Tested 5 times for each frequency.

RAINBOW SIX SIEGE


rainbow-six-siege.jpg


It appears that high-frequency RAM improves the maximum FPS the most here. Tested 7 times for each frequency.

RISE OF THE TOMB RAIDER


rise-of-tomb-raider.jpg


The FPS increase from high-frequency RAM is obvious. Tested the "Mountain Peak" scene 7 times for each frequency.

THIEF


thief.jpg


A similar FPS increase is evident in Thief, just like in Rise of the Tomb Raider. Tested 4 times for each frequency.

WATCH DOGS 2


watch-dogs-2.jpg


Little difference. Tested 5 times for each frequency.

WITCHER 3 WILD HUNT


witcher-3-wild-hunt.jpg


I could not record any difference, although I've seen obvious improvements from higher-frequency RAM in this game when tested elsewhere. Perhaps this test scene is just not stressful enough. Tested 5 times for each frequency.
-------------------------------------------------------------------------------------------

CONCLUSIONS

1. Higher-frequency RAM does increase gaming performance - mostly by a little, but in some cases notably.

2. It is not worth selling your basic DDR4 and buying new high-frequency DDR4 at full retail price.

3. It is only worth upgrading your RAM if the extra cost matches the extra performance you get in return - that being said, it is not worth selling your 16 GB of DDR4 2133 MHz for 60 EUR just to buy 16 GB of DDR4 3000+ for 120 EUR for an extra 10% FPS (see the quick sketch below).
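As a back-of-the-envelope illustration of point 3 (using the hypothetical 60/120 EUR prices and the 10% figure from above):

```python
# Rough cost-per-performance check for a RAM upgrade (hypothetical numbers).
sell_old = 60.0   # EUR recouped by selling the 16 GB DDR4-2133 kit
buy_new = 120.0   # EUR for a 16 GB DDR4-3000+ kit
fps_gain = 10.0   # expected FPS improvement, percent

net_cost = buy_new - sell_old
print(f"Net cost: {net_cost:.0f} EUR for ~{fps_gain:.0f}% more FPS")
print(f"That is {net_cost / fps_gain:.1f} EUR per 1% of FPS gained")
```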
 
I appreciate the effort it took to do this... now I am going to point out some issues inherent in RAM benchmarking. :P

With RAM you really need to show the memory timings for both tested speeds. Pure bandwidth is only a portion of the whole picture when it comes to RAM. There is an apex where tighter timings and high bandwidth meet, and that is the sweet spot... even if clocked a little slower, tighter timings can show an improvement.

How did you get the RAM to run at both 2133 MHz and 3000 MHz? XMP profiles? What did the timings look like?
 
I have finished editing the thread.

Updated! ;)
 
Sometimes none at all, mostly very little (negligible even... 1 FPS), and a notable amount in Project Cars.

Still hardly a difference, for sure. But worth the meager (as I recall it) price difference between 2133/2400 and 3000.
 
@Artas1984 thanks for the time taken for testing all this :respect:

It was a wonderful read!

I second that!

I wonder if a similar test on the Ryzen platform would show any tangible difference compared to this.
 
You know what, my fingers are crossed that Ryzen will show a heck of a lot more difference, but who knows right now :confused:
 
Yeah, well worth the minimal price difference between ultra-low 2133 and 3000/3200 DDR4.

Notice that the badly coded games (Arma 3 + Project Cars) profit the most, for obvious reasons. Also notice that in these games even more bandwidth would probably help even more. Basically, you can't have enough bandwidth on dual-channel CPUs. Even more so for Ryzen, which ties the interconnect between its CPU complexes (Infinity Fabric) directly to RAM speed.
 
Thread updated.

I added a video presentation to my YouTube channel today. All of the pictures represent the real testing locations. I think you can all recognize which are built-in engine benchmarks and which are custom Fraps runs.

I also tested Left 4 Dead 2 on a local server, in which I "don't get 300 FPS" :D

DDR4 2133: 291/293/296 FPS
DDR4 3000: 292/296/300 FPS

Even in this 2009 game I can see a minimal performance boost, as I tested it 7 times in a row.
 
Interesting. I was expecting a little more impact on newer titles, but I guess it is what it is....
 
That ~1% difference in L4D2 falls within the margin of error.

Again, many of these titles show a one-FPS difference... I wouldn't call that minimal, I'd call it margin of error (anything around 1%, honestly...).

The real question is whether it's worth paying for notable increases in two games while the rest fall within the margin of error... even if it is $10 or 10% more...
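For reference, here is the ~1% rule of thumb as a quick check (the threshold and the sample FPS pairs are illustrative):

```python
# Flag FPS deltas that fall inside a ~1% run-to-run margin of error.
def within_margin(fps_a: float, fps_b: float, threshold: float = 0.01) -> bool:
    """True if the relative difference is at or below the threshold."""
    return abs(fps_a - fps_b) / min(fps_a, fps_b) <= threshold

# Illustrative (2133 MHz avg, 3000 MHz avg) pairs.
for a, b in [(291, 292), (146, 147), (42, 45)]:
    gain = (b - a) / a * 100
    verdict = "margin of error" if within_margin(a, b) else "real gain"
    print(f"{a} -> {b} FPS: {gain:+.1f}% ({verdict})")
```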
 
What this testing shows me is how much effort game developers put in when it comes to higher frame rates. The better-looking games have great frame rates without us having to tweak our nuclear-powered babies.

The sad part is this wasn't done on a Ryzen system. (Yet?)
 
There is a video that was making its rounds on the forums... let me link the thread:

It shows bigger improvements than Intel up to 3000 MHz, and it looks to scale a bit above that... however, then you are getting into an 'is it worth it' situation there too...
 
Yes and no, earthdog...

Take, for example, the same L4D2: in the 14 tests that I made with both frequencies, DDR4 2133 MHz would score around 291-293 average FPS (7 attempts) and DDR4 3000 MHz would score 294-296 average FPS (7 attempts). I took only the best results out of those 7 attempts. Even though that is a 1% difference, it is not down to margin of error.

Certainly many games produce accidental results, as you say: Ashes of the Singularity, Assassins Creed Syndicate, Mirrors Edge Catalyst, Middle Earth Shadow of Mordor, Mad Max Fury Road, Dragon Age Inquisition - which means in those games there is no benefit from higher-frequency RAM. I am not counting Crysis 3, Far Cry 4 and Witcher 3, since it has been proven elsewhere that higher-frequency RAM improves FPS in them, even though in my tests the results were minimal.
 
1% is negligible/margin of error... especially in game 'run-throughs' where there isn't a consistent, repeated scene and set of actions on each run.

It's odd how you throw away your own testing for the other games... your results are your results. If you can't trust them all, you can't trust any of them. Stick to your guns next time... but mention that you've seen others get different results. ;)
 
So you're telling him to include every game he tested or none at all. LOL, that's a hell of a potentially long post, please no...
 
He already included those titles, he just 'threw away' the results in the discussion with me, it seems. But really... he did already 'stick to his guns' on those titles in the OP... he just backed out in our conversation, I guess.

EDIT: Literally 18 of the 25 titles tested I would consider had margin-of-error-type increases or simply no gain. These titles had gains of 0, 1, or 3 FPS. The 3 FPS gains I counted as 'no gain/margin of error' were those reaching 262, 150, and 129 FPS respectively. 3 FPS on 42 FPS (which I counted as an increase) is what I would call a minimal gain.

Of the 0 FPS gains, their average FPS was 148, 144, and 89.

Of the 1 FPS difference games, their average FPS was 61/95/83/119/161/90/190/146/135/117/144/72 respectively. A 1 FPS difference is clearly margin of error, particularly when the games are at or above 100 FPS.

Literally, only 3-4 games will make a difference in performance out of the 25 tested.

I also get concerned when one guesses at things... for example, "must not have stressed it enough"... OK, perhaps I can go along with that... but why make it up? Perhaps that game, like the other titles with a 0 FPS difference, just didn't respond at all. That shows bias in the results to me, considering you seemingly expect an increase on every title (and count 1 FPS increases as anything more than negligible or margin of error).

Taking the 'best' run is also concerning... why not the average? Or why not throw away the low and the high and average the rest (a trimmed mean)? The 'best' run is typically anomalous and not the norm.
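For illustration, here is a minimal sketch of that trimmed mean next to the plain mean and the 'best' run (the seven run values are hypothetical):

```python
# Trimmed mean: drop the single lowest and highest run, average the rest.
def trimmed_mean(runs):
    """Average the runs after discarding the one low and one high outlier."""
    trimmed = sorted(runs)[1:-1]
    return sum(trimmed) / len(trimmed)

runs = [50, 68, 85, 72, 86, 85, 43]  # hypothetical seven repeats
print(f"best: {max(runs)}, plain mean: {sum(runs) / len(runs):.1f}, "
      f"trimmed mean: {trimmed_mean(runs):.1f}")
```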

Sorry to be so blunt, Artas. I could really get on board with your testing if it weren't stale (testing things already tested ad nauseam), and the methods are curious to me. Please take this as constructive, and not me being a douche. :)
 
It was a good, fun read... I liked the part about when to buy faster RAM:
... only worth upgrading your RAM if the extra cost matches the extra performance you get in return
Trouble is, I never know today's price of 1 FPS.
 
What is really funny... I think our answer is the same, but the work to get there was pretty different! DDR4 3000 is a sweet spot for price and performance now (remember, other reviews have gone over this since it was released)! I agree with the end message, but perhaps not with the points as stated (see my first post in the thread). :)
 

In all of my tests, earthdog, the scenes and actions were consistent and always the same. But we do agree that a lot of the results could have been within the margin of error, and I myself expected far greater differences between DDR4 2133 MHz and DDR4 3000 MHz. I am not saying they all were, though, since I already explained that with the L4D2 example.

I will stick to the point that only the best results from many runs should be used, and that average results are wrong, because averages include both the worst and the best data. That being said, many times the worst results come from accidental HDD lag, background program checks or similar nuisances, while the best results repeat themselves quite a few times and are never accidental. Here is an example of minimum-FPS runs in Rise of the Tomb Raider:

DDR4 2133 MHz: 54/71/88/75/89/88/45

DDR4 3000 MHz: 48/76/94/95/77/95/50

We can clearly see that three times the GPU manages to pull out its best results for both RAM frequencies; sometimes the FPS falls short, and sometimes a lag occurs, resulting in 50 FPS or lower. Seeing this trend, I clearly understood that only the best FPS should be taken, and the difference between 88/89/88 and 94/95/95 clearly shows what DDR4 2133 MHz vs DDR4 3000 MHz is all about. I will stick to this method and no one will talk me out of it!
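To make my selection rule concrete, here is a minimal sketch of it applied to those exact runs (the helper is just for illustration, not the tool I actually used):

```python
# The minimum-FPS runs posted above (Rise of the Tomb Raider).
runs_2133 = [54, 71, 88, 75, 89, 88, 45]
runs_3000 = [48, 76, 94, 95, 77, 95, 50]

def top_runs(runs, n=3):
    """Return the n highest results; the point is that these repeat tightly."""
    return sorted(runs, reverse=True)[:n]

for label, runs in [("DDR4 2133", runs_2133), ("DDR4 3000", runs_3000)]:
    best3 = top_runs(runs)
    print(f"{label}: top three runs {best3}, spread {max(best3) - min(best3)} FPS")
```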
 
Very nice, this should be pinned or something! You should add a 2666 MHz CL16 kit, since those are quite common and sit in between 2133 and 3000 MHz, though you've already made your point with this testing.
 
Manual run-throughs are not repeatable. Something changes each and every time. The best you can do manually is scenes 'on rails' for consistency.

I don't agree that using the 'fastest' result is best. I can't think of one website that does it that way... perhaps we are all wrong, though...

You keep on keeping on, people eat your testing up like it's The Gospel. :)
 
Thank you for doing this. I found it very interesting. As many have said, it's CPU overclocking that brings the bigger FPS improvements in games.
 

Did I not explain to you, earthdog, why I choose only the best results from the different contenders?

One more time:

DDR4 2133 MHz: 54/71/88/75/89/88/45

DDR4 3000 MHz: 48/76/94/95/77/95/50

If I were to use only averaged data, I would include accidental lag caused by other hardware, and it might even turn out that DDR4 2133 MHz is faster than DDR4 3000 MHz. I must take only the best, yet repeatable, results that the tested component can produce without being hindered by any other hardware parts. This applies to processors and video cards too. I cannot imagine how else to explain it. Don't act like a stubborn child, earthdog, pretending you do not understand what I have thoughtfully explained, since I gave you a very good example of why I am doing it my way.
 