Tuesday, November 2nd 2021

G.SKILL Showcases DDR5-7000 CL40 Extreme Speed Memory

G.SKILL International Enterprise Co., Ltd., the world's leading manufacturer of extreme performance memory and gaming peripherals, is thrilled to announce the achievement of DDR5-7000 CL40-40-40-76 extreme speed on a 32 GB (2x16 GB) kit, validated with the Memtest stability test. A memory speed of 7000 MT/s is an exciting milestone, as until recently it was only seen in overclocking records set under sub-zero liquid nitrogen cooling. Accomplished with high-performance Samsung DDR5 components, this extreme-speed memory is truly worthy of the flagship G.SKILL Trident Z5 family classification.

G.SKILL has been dedicated to developing the fastest possible DDR5 memory for the latest 12th Gen Intel Core desktop processors and Intel Z690 chipset motherboards. Today, G.SKILL is proud to announce the feat of reaching DDR5-7000 extreme speed while maintaining an ultra-low CAS latency of CL40-40-40-76. The memory modules that reached this monumental milestone are built with high-performance Samsung DDR5 components and have been shown to be stable under Memtest. Please refer to the screenshot below.
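(Editor's note: to put 7000 MT/s in perspective, peak theoretical bandwidth is straightforward to estimate. A minimal sketch in Python, assuming the usual 64-bit (8-byte) module width and a dual-channel configuration; the helper name is ours, not G.SKILL's:)

```python
def peak_bandwidth_gb_s(data_rate_mt_s: int, channels: int = 2) -> float:
    """Peak theoretical bandwidth: data rate (MT/s) x 8 bytes per transfer,
    per channel. Real-world throughput will be lower."""
    return data_rate_mt_s * 8 * channels / 1000  # GB/s

print(peak_bandwidth_gb_s(7000))  # ~112 GB/s for dual-channel DDR5-7000
```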
"We are seeing amazing overclocking potential of DDR5 memory on the latest 12th Gen Intel Core desktop processors and Intel Z690 chipset motherboards," says Tequila Huang, Corporate Vice President of G.SKILL International. "DDR5-7000 is an incredible milestone for us, and we will continue to work with our industry partners to develop ever-faster DDR5 memory for PC enthusiasts and overclockers."

DDR5-6666 CL40 - Fastest on Intel XMP 3.0 List
In the same vein of extreme-performance kits, the G.SKILL DDR5-6666 CL40 memory kit is currently the fastest on the Intel XMP 3.0 certified memory list. To view the list, please refer to this page.

45 Comments on G.SKILL Showcases DDR5-7000 CL40 Extreme Speed Memory

#26
HenrySomeone
Cobain: I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there), because it can be paired with really low-latency RAM (like 4200 C16, for example) and has 10 cores on a monolithic die. It averages 220 fps with 190-ish lows, while the 5950X averages 200 with 170-ish lows. The 5900X, 5800X, 5600X, 11900K, 11700K, etc. are crap in comparison.

I have no doubt that Warzone will have worse fps when using DDR5. Latency is the most important thing for CPU-bound games. My question was... who is going to buy a mainstream platform and take "advantage" of higher-speed DDR5 RAM in certain applications that would clearly benefit more from other platforms? That's what I don't understand. These are gaming CPUs above anything. Don't tell me you are going to buy a mainstream platform for 24/7 rendering; it makes no sense.

So this is about to happen: people spending loads on fancy new DDR5 just to get worse performance, because the timings are not quite there yet. And if you think "long term", then waiting for Raptor Lake/Zen 4 would be the better option, as by then DDR5 will surely be faster.

People will be in for a lot of surprises in two days when the reviews drop... quick tip: look at the lower-end boards with DDR4... or just skip this gen. Because, oh boy, a 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...
It's fortunate that there will be DDR4 boards then, no? What I want to know is whether there is any indication of dual-type boards, like there used to be for DDR1/DDR2 and later DDR2/DDR3. I don't remember there being any DDR3/DDR4 ones, so probably not, but maybe...

Hmmm, there seem to have been at least a couple after all:
DDR3 and DDR4 on one Motherboard? | OC3D News (overclock3d.net)
ASRock > B150M Combo-G
#27
RandallFlagg
Cobain: I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there), because it can be paired with really low-latency RAM (like 4200 C16, for example) and has 10 cores on a monolithic die. It averages 220 fps with 190-ish lows, while the 5950X averages 200 with 170-ish lows. The 5900X, 5800X, 5600X, 11900K, 11700K, etc. are crap in comparison.

I have no doubt that Warzone will have worse fps when using DDR5. Latency is the most important thing for CPU-bound games. My question was... who is going to buy a mainstream platform and take "advantage" of higher-speed DDR5 RAM in certain applications that would clearly benefit more from other platforms? That's what I don't understand. These are gaming CPUs above anything. Don't tell me you are going to buy a mainstream platform for 24/7 rendering; it makes no sense.

So this is about to happen: people spending loads on fancy new DDR5 just to get worse performance, because the timings are not quite there yet. And if you think "long term", then waiting for Raptor Lake/Zen 4 would be the better option, as by then DDR5 will surely be faster.

People will be in for a lot of surprises in two days when the reviews drop... quick tip: look at the lower-end boards with DDR4... or just skip this gen. Because, oh boy, a 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...
I think your problem is that you're translating "Warzone" to mean "all games". We also really don't know the impact of cache/ring speed on this architecture. Also, Warzone is only faster on the 10900K with very highly overclocked memory and CPU; out of the box with mundane hardware, it's faster on the 5950X.

To give an example, framechasers (YouTube) did some tests on a 10900K: the difference between DDR4-4400 with a ring ratio of 40 vs. a ring of 50 is 9%, or 18 fps, but it only lowers latency by 2.8 ns, or about 6%. Now, this is not the same as the memory latency discussed above; this is total latency. Total is what we don't yet know.
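(Editor's note: to make the arithmetic behind those figures explicit, an 18 fps gain amounting to 9% implies a ~200 fps baseline, and a 2.8 ns drop amounting to roughly 6% implies a total latency baseline around 47 ns. A minimal sketch, taking the quoted numbers at face value:)

```python
# Figures quoted above, taken at face value
fps_gain, fps_gain_pct = 18, 9.0
latency_drop_ns, latency_drop_pct = 2.8, 6.0

# Back out the implied baselines from the absolute and relative changes
baseline_fps = fps_gain / (fps_gain_pct / 100)                     # ~200 fps
baseline_latency_ns = latency_drop_ns / (latency_drop_pct / 100)   # ~46.7 ns
print(f"implied baselines: {baseline_fps:.0f} fps, {baseline_latency_ns:.1f} ns total latency")
```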

#28
phanbuey
50 ring bro... framechasers and their OCs. :kookoo:

I can't imagine a world where that thing doesn't just randomly lock up.

Gen 1 DDR5 vs. the fastest DDR4 is not going to win... that's like comparing DDR4-2133 CL15 to the fastest DDR3 kits - it will get smoked. It's going to be at least a few months until worthwhile DDR5 kits are available.
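(Editor's note: the numbers bear that comparison out. A minimal sketch using the first-word latency formula that appears later in this thread, CL / data rate × 2000; DDR3-2400 CL10 is our assumed stand-in for "the fastest DDR3 kits", not a figure from the post:)

```python
def first_word_latency_ns(cl: int, data_rate_mt_s: int) -> float:
    """First-word latency in ns: CL divided by the data rate (MT/s), times 2000."""
    return cl / data_rate_mt_s * 2000

print(first_word_latency_ns(15, 2133))  # DDR4-2133 CL15: ~14.1 ns
print(first_word_latency_ns(10, 2400))  # DDR3-2400 CL10 (assumed kit): ~8.3 ns
```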
#29
RandallFlagg
Just to illustrate something regarding latency: notice how bad the latency on this Trident Z Royal DDR4 kit is, despite CL17. But notice how well it performs in gaming:

#30
Unregistered
Cobain: I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there), because it can be paired with really low-latency RAM (like 4200 C16, for example) and has 10 cores on a monolithic die. It averages 220 fps with 190-ish lows, while the 5950X averages 200 with 170-ish lows. The 5900X, 5800X, 5600X, 11900K, 11700K, etc. are crap in comparison.

I have no doubt that Warzone will have worse fps when using DDR5. Latency is the most important thing for CPU-bound games. My question was... who is going to buy a mainstream platform and take "advantage" of higher-speed DDR5 RAM in certain applications that would clearly benefit more from other platforms? That's what I don't understand. These are gaming CPUs above anything. Don't tell me you are going to buy a mainstream platform for 24/7 rendering; it makes no sense.

So this is about to happen: people spending loads on fancy new DDR5 just to get worse performance, because the timings are not quite there yet. And if you think "long term", then waiting for Raptor Lake/Zen 4 would be the better option, as by then DDR5 will surely be faster.

People will be in for a lot of surprises in two days when the reviews drop... quick tip: look at the lower-end boards with DDR4... or just skip this gen. Because, oh boy, a 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...
Warzone is trash and should not be used as a benchmark.
#31
RandallFlagg
phanbuey: 50 ring bro... framechasers and their OCs. :kookoo:

I can't imagine a world where that thing doesn't just randomly lock up.

Gen 1 DDR5 vs. the fastest DDR4 is not going to win... that's like comparing DDR4-2133 CL15 to the fastest DDR3 kits - it will get smoked. It's going to be at least a few months until worthwhile DDR5 kits are available.
Watching that guy basically got me back into tweaking and such. I think he's going to like AL; there are a lot of reports of big overclocking headroom.

I really don't know on DDR5 vs. DDR4. I was around for the DDR3->DDR4 and earlier transitions, and yeah, that's how it's always been. But for someone on, say, DDR4-3200 CL16 or some such, baseline DDR5-4800 (JEDEC) may very well be faster. We didn't get that heavy an MT/s bump when we went from DDR3-1600 to DDR4-1866/2133. I mean, technically there is a DDR3-2133 JEDEC standard.

Of course, the assumption here is that one is OK with spending an extra $100-200 on RAM if they are going DDR5 too.
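(Editor's note: the size of those generational jumps is easy to put numbers on. A minimal sketch comparing the launch-era JEDEC data rates mentioned above:)

```python
# Launch-era transition points mentioned above (JEDEC data rates, MT/s)
transitions = {
    "DDR3-1600 -> DDR4-2133": (1600, 2133),
    "DDR4-3200 -> DDR5-4800": (3200, 4800),
}
for label, (old, new) in transitions.items():
    print(f"{label}: +{(new - old) / old * 100:.0f}% raw data rate")
# DDR3 -> DDR4 opened at roughly +33%; DDR4 -> DDR5 opens at +50%
```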
#32
Unregistered
RandallFlagg: Just to illustrate something regarding latency: notice how bad the latency on this Trident Z Royal DDR4 kit is, despite CL17. But notice how well it performs in gaming:
That Sandra memory benchmark seems to be really random. I would use their multi-core benchmark instead, but hey, this is Tom's Hardware. Can't expect that much.
#33
HenrySomeone
rares495: Warzone is trash and should not be used as a benchmark.
You'd be saying otherwise in a heartbeat if Zen 3 were the best-performing CPU in it. ;)
#34
Unregistered
HenrySomeone: You'd be saying otherwise in a heartbeat if Zen 3 were the best-performing CPU in it. ;)
No, I wouldn't. I'm not 12 years old.
#36
Cobain
rares495: Warzone is trash and should not be used as a benchmark.
It should, considering it scales with basically everything... and it's also one of the most-played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota, or League, where any decent three-year-old six-core maxes all of them out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to run very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
#37
RandallFlagg
Cobain: It should, considering it scales with basically everything... and it's also one of the most-played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota, or League, where any decent three-year-old six-core maxes all of them out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to run very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
I have a feeling you're going to be right on that. I think there are some scenarios where the specific instructions being executed run so fast on every modern CPU that the only thing that matters is getting the data into the pipeline as quickly as possible. OTOH, we do have Alder Lake DDR4 boards, and they are not all low-end.

Three more hours...
#39
Unregistered
Cobain: It should, considering it scales with basically everything... and it's also one of the most-played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota, or League, where any decent three-year-old six-core maxes all of them out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to run very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
The game has many issues due to its engine, and Activision always manages to screw something up with every patch. Plus, it's multiplayer like the other games you mentioned, so :sleep: boring and really uninteresting. If that's where Intel chooses to dominate, I say great. Take all the trash games and we'll keep the rest.

I'm really excited for Alder Lake though.
#40
HenrySomeone
Well pal, it's multiplayer games where frames happen to matter the most, so... :rolleyes:
#41
Cobain
rares495: The game has many issues due to its engine, and Activision always manages to screw something up with every patch. Plus, it's multiplayer like the other games you mentioned, so :sleep: boring and really uninteresting. If that's where Intel chooses to dominate, I say great. Take all the trash games and we'll keep the rest.

I'm really excited for Alder Lake though.
So where do you want the frames? I mean, I use a 240 Hz monitor and can't use anything else for a multiplayer game. Give me frames. They provide lower response times, easier aiming, more fun, and smoother gameplay!

But for freaking Tomb Raider or Days Gone? Dude, I literally lock all of that stuff at 60 fps, try to use as much resolution and detail as possible, and just have fun. Single-player gamers are the happiest ones. They can literally get an i3-10100F, pair it with baseline 3200 MT/s RAM, and run every game at 60 fps, all for 80€, with passive or inaudible 400 rpm cooling and 30 watts while gaming. :|
#42
TheLostSwede
News Editor
cadaveca: Notice 6000 is not even listed. :p
Yeah, because Intel doesn't support overclocking... Yet offers XMP profiles... o_O
#43
Prima.Vera
Sorry, but even with those high speeds and bandwidth, the latencies are still crap.
This can be verified with a proper calculator like this one:
notkyon.moe/ram-latency.htm

On topic, those sticks should be CL32 at the least, and only anything below that value should be called ultra-low latency.

Anything above 10 ns is junk, while good RAM is anything at 9 ns or lower.
Guess we need to wait for 13th Gen Intel Core until we can get those... Sadly.
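(Editor's note: that threshold claim is easy to sanity-check with the first-word latency formula, CL / data rate × 2000. A minimal sketch of the CL a DDR5-7000 kit would need to dip under those marks; the helper is ours, for illustration:)

```python
import math

def max_cl_for_latency(target_ns: float, data_rate_mt_s: int) -> int:
    # first-word latency (ns) = CL / data rate * 2000, so CL <= target * rate / 2000
    return math.floor(target_ns * data_rate_mt_s / 2000)

for target_ns in (10.0, 9.0):
    cl = max_cl_for_latency(target_ns, 7000)
    print(f"DDR5-7000 needs CL{cl} or better to stay under {target_ns} ns")
# under 10 ns -> CL35; under 9 ns -> CL31 (CL32 works out to ~9.14 ns)
```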
#44
Why_Me
Cobain: It should, considering it scales with basically everything... and it's also one of the most-played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota, or League, where any decent three-year-old six-core maxes all of them out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to run very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
Alder Lake w/DDR4 seems to perform well in games.

#45
Icon Charlie
RandallFlagg: (CAS latency / data rate in MT/s) × 2000 = latency in nanoseconds

DDR5-7000 CL40:
40/7000 × 2000 = 11.4 ns

DDR4-3600 CL16 (good stuff):
16/3600 × 2000 = 8.9 ns

DDR4-3200 CL18 (common stuff):
18/3200 × 2000 = 11.25 ns

This is actually getting there on latency, a whole lot faster than what happened with DDR4 vs. DDR3. The Friday NDA lift will be an interesting day.
My DDR4-3200 timings are 14-14-14-34: 14/3200 × 2000 = 8.75 ns. The RAM is G.SKILL Flare X.
My DDR4-3600 timings are 14-14-14-34: 14/3600 × 2000 = 7.78 ns. The RAM is an OLOY Blade DDR4-4000 kit downclocked to 3600.

The OLOY kit's rated DDR4-4000 timings are 15-15-15-36: 15/4000 × 2000 = 7.5 ns.

You can see why I downclocked: I use less voltage and get pretty much the same performance. This is also why I generally buy the best RAM I can find on sale; my G.SKILL is still a viable option for my Ryzen CPUs.
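(Editor's note: for anyone who wants to run the same arithmetic on their own kit, a minimal sketch of the formula used in the two posts above, with the kits discussed in this thread as examples:)

```python
def first_word_latency_ns(cl: int, data_rate_mt_s: int) -> float:
    """First-word latency in ns = CL / data rate (MT/s) * 2000."""
    return cl / data_rate_mt_s * 2000

kits = [
    ("DDR5-7000 CL40 (this announcement)", 40, 7000),
    ("DDR4-3600 CL16 (good stuff)",        16, 3600),
    ("DDR4-3200 CL18 (common stuff)",      18, 3200),
    ("DDR4-3200 CL14 (G.SKILL Flare X)",   14, 3200),
    ("DDR4-3600 CL14 (downclocked OLOY)",  14, 3600),
    ("DDR4-4000 CL15 (OLOY rated)",        15, 4000),
]
for name, cl, rate in kits:
    print(f"{name}: {first_word_latency_ns(cl, rate):.2f} ns")
```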