
G.SKILL Showcases DDR5-7000 CL40 Extreme Speed Memory

That sucks. Does that really make me want this RAM less? Nope. The fun is in trying to get it stable in whatever boards you get your hands on.
The thing here is that even Intel has officially said that you need a two-DIMM board to get the best out of DDR5 memory, and it seems like there might be a much harder-to-pass limit on four-DIMM boards. I guess we'll find out soon enough.


 
I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there), because it can be paired with really low latency RAM (like 4200 C16, for example) and has 10 cores on a monolithic die. It averages 220 fps with 190ish lows, while the 5950X averages 200 with 170ish lows. The 5900X, 5800X, 5600X, 11900K, 11700K etc. are crap in comparison.

I have no doubt that Warzone will have worse fps when using DDR5. Latency is the most important thing for CPU-bound games. My question was... who is going to buy a mainstream platform and take "advantage" of higher-speed DDR5 RAM in certain applications that would clearly benefit more from other platforms? That's what I don't understand. These are gaming CPUs above anything. Don't tell me you are going to buy a mainstream platform for 24/7 rendering; that makes no sense.

So this is about to happen: people spending loads on fancy new DDR5 just to get worse performance, because the timings are not quite there yet. And if you think "long term", then waiting for Raptor Lake/Zen 4 would be a far better option, as by that time DDR5 will be faster for sure.

People will get a lot of surprises in 2 days when the reviews drop... quick tip: look for the lower-end boards with DDR4... or just skip this gen. Because, oh boy, a 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...
It's fortunate that there will be DDR4 boards then, no? What I want to know is whether there is any indication there will be any dual-type boards, like there used to be for DDR1/DDR2 and then later DDR2/DDR3. I don't remember there being any DDR3/DDR4 ones, so probably not, but maybe...

Hmmm, there seem to have been at least a couple after all:
DDR3 and DDR4 on one Motherboard? | OC3D News (overclock3d.net)
ASRock > B150M Combo-G
 
I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there)... a 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...

I think your problem is that you're translating "Warzone" to mean "all games". We also really don't know the impact of cache/ring speed on this architecture. Also, Warzone is only faster on a 10900K with very high memory and CPU overclocks. Out of the box, using mundane hardware, it's faster on a 5950X.

To give an example, Framechasers (YouTube) did some tests and got the results below on a 10900K. The difference at DDR4-4400 between a ring of 40 and a ring of 50 is 9%, or 18 fps, but the higher ring only lowers latency by 2.8 ns, or about 6%. Now, this is not the same as the memory latency discussed; this is total latency. Total is what we don't yet know.

[chart: Framechasers 10900K test, DDR4-4400 with ring 40 vs. ring 50]
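A quick back-of-envelope check of those figures, taking the quoted 2.8 ns and ~6% at face value (the ~47 ns total is an inference from those numbers, not a measured value):

```python
# Sanity check of the quoted ring-ratio numbers (assumed, not measured here).
latency_drop_ns = 2.8     # ring 50 vs. ring 40 at DDR4-4400, from the post
latency_drop_frac = 0.06  # the same drop expressed as ~6%
fps_gain_frac = 0.09      # 9% (18 fps) gain quoted for the same change

# If 2.8 ns is ~6% of the total, the implied total latency is:
total_latency_ns = latency_drop_ns / latency_drop_frac
print(f"implied total latency at ring 40: ~{total_latency_ns:.0f} ns")  # ~47 ns

# fps scaling faster than the latency cut suggests a strongly latency-bound load
print(f"fps gain per 1% of latency removed: ~{fps_gain_frac / latency_drop_frac:.1f}%")
```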
 
50 ring, bro... Framechasers and their OCs. :kookoo:

I can't imagine a world where that thing doesn't just randomly lock up.

Gen 1 DDR5 vs. the fastest DDR4 is not going to win... that's like comparing DDR4-2133 CL15 to the fastest DDR3 kits - it would get smoked. It's going to be at least a few months until worthwhile DDR5 kits are available.
 
Just to illustrate something regarding latency - notice how bad the latency on this Trident Z Royal DDR4 kit is, despite CL17:

[screenshot: SiSoft Sandra memory latency results (Tom's Hardware)]


But notice how well it performs in gaming:

[screenshot: gaming benchmark results]
 
I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there)... a 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...

Warzone is trash and should not be used as a benchmark.
 
50 ring, bro... Framechasers and their OCs. :kookoo: ... Gen 1 DDR5 vs. the fastest DDR4 is not going to win...

Watching that guy basically got me back into tweaking and such. I think he's going to like Alder Lake; there are a lot of reports of big overclocking headroom.

I really don't know on DDR5 vs. DDR4. I was around for the DDR3-to-DDR4 and earlier transitions, and yeah, that's how it's always been. But for someone on, say, DDR4-3200 CL16 or some such, baseline DDR5-4800 (JEDEC) may very well be faster. We didn't get that heavy an MT/s bump when we went from DDR3-1600 to DDR4-1866/2133. I mean, technically there is a DDR3-2133 JEDEC standard.

Of course, the assumption here is that one is OK with spending an extra 100-200 on RAM if they're going DDR5, too.
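To put rough numbers on that comparison, here's a minimal sketch assuming the common JEDEC DDR5-4800 CL40 bin (first-word latency only; real-world results also depend on the memory controller and subtimings):

```python
# Rough first-word latency and peak bandwidth for the two configs above.
# Assumption: JEDEC DDR5-4800 runs at CL40 (a common JEDEC bin).

def first_word_latency_ns(cl, mts):
    # CAS cycles divided by the real clock (MT/s / 2), converted to ns
    return cl / (mts / 2) * 1000

def peak_bandwidth_gbs(mts, bus_bytes=8):
    # peak transfer rate for one 64-bit channel
    return mts * bus_bytes / 1000

for name, cl, mts in [("DDR4-3200 CL16", 16, 3200),
                      ("DDR5-4800 CL40", 40, 4800)]:
    print(f"{name}: {first_word_latency_ns(cl, mts):.1f} ns, "
          f"{peak_bandwidth_gbs(mts):.1f} GB/s peak")

# DDR4-3200 CL16: 10.0 ns, 25.6 GB/s peak
# DDR5-4800 CL40: 16.7 ns, 38.4 GB/s peak
# i.e. the DDR5 baseline wins on bandwidth but loses on first-word latency.
```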
 
Just to illustrate something regarding latency - notice how bad the latency on this Trident Z Royal DDR4 kit is, despite CL17... but notice how well it performs in gaming...

That Sandra memory benchmark seems really random. I would use their multi-core benchmark instead, but hey, this is Tom's Hardware. Can't expect that much.
 
Warzone is trash and should not be used as a benchmark.

It should, considering it scales with basically everything... and it's also one of the most played games in the world right now. And one that makes overclocking and tuning actually FUN and USEFUL. Something that isn't useful anymore in other games like Overwatch, CS, Dota, or League, where any decent three-year-old 6-core will max all of them out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes any other chip. Idk if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance. Won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big map/world online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die and the ability to use very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and stuff like that, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
 
It should, considering it scales with basically everything... I am sure Alder Lake will beat the 10900K in every console-port single-player game, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config...

I have a feeling you're going to be right on that. I think there are some scenarios where the specific instructions being executed run so fast on every modern CPU that the only thing that matters is getting the data they operate on into the pipeline as quickly as possible. OTOH, we do have Alder Lake DDR4 boards, and they are not all low end.

Three more hours...
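A minimal sketch of the kind of workload that behaves this way: a pointer chase, where every load depends on the previous one, so the CPU mostly waits on memory. Purely illustrative; in Python the absolute numbers are dominated by interpreter overhead, and a real measurement would use C or a tool like Intel MLC.

```python
# Pointer-chase sketch: serially dependent loads across a table larger than
# many consumer L3 caches, so each step is roughly one memory round trip.
import random
import time
from array import array

N = 1 << 22  # 4M entries -> a 16 MB 'i' array

# Build one big cycle so the chase visits addresses in random order and
# never settles into a short, cache-friendly loop.
nodes = list(range(N))
random.shuffle(nodes)
nxt = array('i', [0]) * N
for a, b in zip(nodes, nodes[1:]):
    nxt[a] = b
nxt[nodes[-1]] = nodes[0]
del nodes  # free the temporary list

steps = 1_000_000
idx = 0
t0 = time.perf_counter()
for _ in range(steps):
    idx = nxt[idx]  # each load depends on the previous result
elapsed = time.perf_counter() - t0
print(f"~{elapsed * 1e9 / steps:.0f} ns per dependent access "
      f"(includes interpreter overhead)")
```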
 
It should, considering it scales with basically everything... So yeah, I would say Warzone should really be used as a benchmark...
The game has many issues due to its engine, and Activision always manages to screw something up with every patch. Plus, it's multiplayer like the other games you mentioned, so :sleep: boring and really uninteresting. If that's where Intel chooses to dominate, I say great. Take all the trash games and we'll keep the rest.

I'm really excited for Alder Lake, though.
 
The game has many issues due to its engine... Take all the trash games and we'll keep the rest...

So where do you want the frames? I mean, I use a 240 Hz monitor and can't use anything else for a multiplayer game. Give me frames. They provide lower response times, easier aiming, more fun, more smoothness!

But for freaking Tomb Raider or Days Gone? Dude, I literally lock all of that stuff at 60 fps, try to use as much resolution and detail as possible, and just have fun. Single-player gamers are the happiest ones. They can literally get an i3-10100F, pair it with baseline 3200 MT/s RAM, and run every game at 60 fps. For 80€, with passive or 400 rpm inaudible cooling, and 30 watts while gaming :|
 
Sorry, but even with those high speeds and all that bandwidth, the latencies are still crap.
And this is proven by using a proper calculator like this one:

On topic, those sticks should be CL32 at least; "ultra-low latency" should mean anything below that value.

Anything above 10 ns is junk, while good RAM is anything at 9 ns or lower.
Guess we need to wait for 13th Gen Intel Core until we can get those... Sadly.
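For reference, here's what those rule-of-thumb thresholds imply for these sticks, using the usual first-word latency formula (a sketch of the rule above, not an official spec):

```python
# What CAS latency would DDR5-7000 need to meet the thresholds above?
# Inverting latency_ns = CL / MT/s * 2000 gives CL = target_ns * MT/s / 2000.
mts = 7000
for target_ns in (10, 9):
    max_cl = target_ns * mts / 2000
    print(f"<= {target_ns} ns at DDR5-{mts}: needs CL {max_cl:.1f} or lower")

# <= 10 ns at DDR5-7000: needs CL 35.0 or lower
# <=  9 ns at DDR5-7000: needs CL 31.5 or lower (roughly the "CL32 at least" point)
```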
 
I am sure Alder Lake will beat the 10900K in every console-port single-player game... But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config...
Alder Lake w/DDR4 seems to perform well in games.

[chart: Alder Lake with DDR4, gaming benchmark results]
 
(CAS latency / data rate in MT/s) x 2000 = latency in nanoseconds

DDR5-7000 CL40

40 / 7000 * 2000 = 11.4 ns

DDR4-3600 CL16 (good stuff)

16 / 3600 * 2000 = 8.9 ns

DDR4-3200 CL18 (common stuff)

18 / 3200 * 2000 = 11.25 ns


This is actually getting there on latency, a whole lot faster than what happened with DDR4 vs DDR3. Friday NDA lift will be an interesting day.
My PC 3200 timings are 14-14-14-34: 14/3200 * 2000 = 8.75 ns. RAM is G.Skill Flare X.
My PC 3600 timings are 14-14-14-34: 14/3600 * 2000 = 7.78 ns. RAM is OLOY Blade PC 4000 downclocked to PC 3600.

OLOY PC 4000 timings are 15-15-15-36: 15/4000 * 2000 = 7.5 ns.

You can see why I clocked down: it was to use less voltage and get pretty much the same performance. This is also why I generally get the best RAM I can when it's on sale. My G.Skill is still a viable option for my Ryzen CPUs.
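The same arithmetic, as a minimal sketch that reproduces the numbers in these posts:

```python
# First-word latency from CAS latency and data rate, as used above.
def latency_ns(cl, mts):
    # MT/s is twice the real clock, hence the factor of 2000
    return cl / mts * 2000

kits = [
    ("DDR5-7000 CL40", 40, 7000),
    ("DDR4-3600 CL16", 16, 3600),
    ("DDR4-3200 CL18", 18, 3200),
    ("DDR4-3200 CL14", 14, 3200),  # G.Skill Flare X above
    ("DDR4-3600 CL14", 14, 3600),  # OLOY kit, downclocked
    ("DDR4-4000 CL15", 15, 4000),  # OLOY kit at rated speed
]
for name, cl, mts in kits:
    print(f"{name}: {latency_ns(cl, mts):.2f} ns")
```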
 