
G.SKILL Showcases DDR5-7000 CL40 Extreme Speed Memory

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,272 (2.28/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
That sucks. Does that really make me want this RAM less? Nope. The fun is trying to get it stable in whatever boards you can get your hands on.
The thing here is that even Intel has officially said that you need a two-DIMM board to get the best out of DDR5 memory, and it seems like there might be a much harder limit to pass on four-DIMM boards. I guess we'll find out soon enough.


 
Joined
Apr 16, 2019
Messages
632 (0.34/day)
I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there), because it can be paired with really low-latency RAM (like 4200 C16, for example) and has 10 cores on a monolithic die. It averages 220 fps with 190-ish lows, while the 5950X averages 200 with 170-ish lows. The 5900X, 5800X, 5600X, 11900K, 11700K etc. are crap in comparison.

I have no doubt that Warzone will have worse fps when using DDR5. Latency is the most important thing for CPU-bound games. My question was... who is going to buy a mainstream platform and take "advantage" of higher-speed DDR5 RAM in certain applications that would clearly benefit more from other platforms? That's what I don't understand. These are gaming CPUs above anything else. Don't tell me you are going to buy a mainstream platform for 24/7 rendering; it makes no sense.

So this is about to happen: people spending loads on fancy new DDR5 just to get worse performance, because the timings are not quite there yet. And if you think "long term", then waiting for Raptor Lake/Zen 4 would be a far better option, as by then DDR5 will surely be faster.

People are in for a lot of surprises in two days when the reviews drop... quick tip: look for the lower-end boards with DDR4... or just skip this gen. Because, oh boy, that 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...
It's fortunate that there will be DDR4 boards then, no? What I want to know is whether there is any indication of dual-type boards, like there used to be for DDR1/DDR2 and later DDR2/DDR3. I don't remember there being any DDR3/DDR4 ones, so probably not, but maybe...

Hmmm, there seem to have been at least a couple after all:
DDR3 and DDR4 on one Motherboard? | OC3D News (overclock3d.net)
ASRock > B150M Combo-G
 
Joined
Jan 27, 2015
Messages
1,650 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there), because it can be paired with really low-latency RAM (like 4200 C16, for example) and has 10 cores on a monolithic die. It averages 220 fps with 190-ish lows, while the 5950X averages 200 with 170-ish lows. The 5900X, 5800X, 5600X, 11900K, 11700K etc. are crap in comparison.

I have no doubt that Warzone will have worse fps when using DDR5. Latency is the most important thing for CPU-bound games. My question was... who is going to buy a mainstream platform and take "advantage" of higher-speed DDR5 RAM in certain applications that would clearly benefit more from other platforms? That's what I don't understand. These are gaming CPUs above anything else. Don't tell me you are going to buy a mainstream platform for 24/7 rendering; it makes no sense.

So this is about to happen: people spending loads on fancy new DDR5 just to get worse performance, because the timings are not quite there yet. And if you think "long term", then waiting for Raptor Lake/Zen 4 would be a far better option, as by then DDR5 will surely be faster.

People are in for a lot of surprises in two days when the reviews drop... quick tip: look for the lower-end boards with DDR4... or just skip this gen. Because, oh boy, that 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...

I think your problem is that you're treating "Warzone" as meaning "all games". We also really don't know the impact of cache/ring speed on this architecture. Also, Warzone is only faster on a 10900K with a very high memory and CPU OC; out of the box, on mundane hardware, it's faster on a 5950X.

To give an example, framechasers (YouTube) did some tests on a 10900K and got the results below. The difference between DDR4-4400 with a ring of 40 vs. a ring of 50 is 9%, or 18 fps, but it only lowers latency by 2.8 ns, or about 6%. Note this is not the same as the memory latency discussed above; this is total latency. Total is what we don't yet know.

[Attachment: framechasers 10900K ring-ratio benchmark screenshot]
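As a side note, those percentages check out with a quick back-of-the-envelope script (the 200 fps and ~46.7 ns baselines here are my own guesses for illustration, not figures from the screenshot):

```python
# Sanity-check the ring-ratio numbers quoted above.
# Assumed baselines (my guesses, not from the screenshot):
# ~200 fps and ~46.7 ns total latency at ring 40.
fps_base, fps_gain = 200.0, 18.0
lat_base_ns, lat_drop_ns = 46.7, 2.8

fps_pct = fps_gain / fps_base * 100        # fps gain in percent
lat_pct = lat_drop_ns / lat_base_ns * 100  # latency drop in percent

print(f"fps gain: {fps_pct:.1f}%, latency drop: {lat_pct:.1f}%")
```

In other words, a ~6% total-latency reduction bought a ~9% fps gain in that test, which is why ring/cache speed matters here.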
 
Joined
Nov 13, 2007
Messages
10,263 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
A 50 ring, bro... framechasers and their OCs. :kookoo:

I can't imagine a world where that thing doesn't just randomly lock up.

Gen 1 DDR5 vs. the fastest DDR4 is not going to win... that's like comparing DDR4-2133 CL15 to the fastest DDR3 kits: it would get smoked. It's going to be at least a few months until worthwhile DDR5 kits are available.
 
Joined
Jan 27, 2015
Messages
1,650 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Just to illustrate something regarding latency, notice how bad the latency on this Trident Z Royal DDR4 kit is, despite CL17:

[Attachment: Sandra memory latency benchmark screenshot]


But notice how well it performs in gaming:

[Attachment: gaming benchmark screenshot]
 
D

Deleted member 215115

Guest
I know for a fact that the 10900K is the best CPU for Warzone (the most CPU-demanding game out there), because it can be paired with really low-latency RAM (like 4200 C16, for example) and has 10 cores on a monolithic die. It averages 220 fps with 190-ish lows, while the 5950X averages 200 with 170-ish lows. The 5900X, 5800X, 5600X, 11900K, 11700K etc. are crap in comparison.

I have no doubt that Warzone will have worse fps when using DDR5. Latency is the most important thing for CPU-bound games. My question was... who is going to buy a mainstream platform and take "advantage" of higher-speed DDR5 RAM in certain applications that would clearly benefit more from other platforms? That's what I don't understand. These are gaming CPUs above anything else. Don't tell me you are going to buy a mainstream platform for 24/7 rendering; it makes no sense.

So this is about to happen: people spending loads on fancy new DDR5 just to get worse performance, because the timings are not quite there yet. And if you think "long term", then waiting for Raptor Lake/Zen 4 would be a far better option, as by then DDR5 will surely be faster.

People are in for a lot of surprises in two days when the reviews drop... quick tip: look for the lower-end boards with DDR4... or just skip this gen. Because, oh boy, that 12900K with dual-rank DDR4 at 3866 C14 in Gear 1 will completely obliterate any DDR5 config...

Warzone is trash and should not be used as a benchmark.
 
Joined
Jan 27, 2015
Messages
1,650 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
A 50 ring, bro... framechasers and their OCs. :kookoo:

I can't imagine a world where that thing doesn't just randomly lock up.

Gen 1 DDR5 vs. the fastest DDR4 is not going to win... that's like comparing DDR4-2133 CL15 to the fastest DDR3 kits: it would get smoked. It's going to be at least a few months until worthwhile DDR5 kits are available.

Watching that guy basically got me back into tweaking and such. I think he's going to like Alder Lake; there are a lot of reports of big overclocking headroom.

I really don't know on DDR5 vs. DDR4. I was around for the DDR3-to-DDR4 and earlier transitions, and yeah, that's how it's always been. But for someone on, say, DDR4-3200 CL16 or some such, baseline DDR5-4800 (JEDEC) may very well be faster. We didn't get that heavy an MT/s bump when we went from DDR3-1600 to DDR4-1866/2133. I mean, technically there is a DDR3-2133 JEDEC standard.

Of course, the assumption here is that one is OK with spending an extra 100-200 on RAM if they're going DDR5 too.
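To put rough numbers on that comparison (the DDR5-4800 CL40 figures below are my assumption for a typical JEDEC baseline kit, not something stated in the thread):

```python
# First-word latency and peak per-DIMM bandwidth for a common DDR4 kit
# vs an assumed baseline JEDEC DDR5 kit (DDR5-4800 CL40).
def latency_ns(cl, mts):
    # CL cycles at the memory clock, which runs at half the transfer rate
    return cl * 2000 / mts

def bandwidth_gbs(mts):
    # 64-bit (8-byte) data path per DIMM for both DDR4 and DDR5
    return mts * 8 / 1000

for name, cl, mts in [("DDR4-3200 CL16", 16, 3200),
                      ("DDR5-4800 CL40", 40, 4800)]:
    print(f"{name}: {latency_ns(cl, mts):.2f} ns, {bandwidth_gbs(mts):.1f} GB/s")
```

The DDR5 kit trades noticeably worse first-word latency (~16.7 ns vs. 10 ns) for 50% more peak bandwidth, which is why which one "wins" depends on the workload.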
 
D

Deleted member 215115

Guest
Just to illustrate something regarding latency, notice how bad the latency on this Trident Z Royal DDR4 kit is, despite CL17:

But notice how well it performs in gaming:

That Sandra memory benchmark seems to be really random. I would use their multicore benchmark instead, but hey, this is Tom's Hardware. Can't expect that much.
 

Joined
Jan 24, 2020
Messages
107 (0.07/day)
Warzone is trash and should not be used as a benchmark.

It should, considering it scales with basically everything... and it's also one of the most played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota or League, where any decent three-year-old 6-core will max them all out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; I won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to use very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
 
Joined
Jan 27, 2015
Messages
1,650 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
It should, considering it scales with basically everything... and it's also one of the most played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota or League, where any decent three-year-old 6-core will max them all out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; I won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to use very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.

I have a feeling you're going to be right about that. I think there are some scenarios where the specific instructions being executed run so fast on every modern CPU that the only thing that matters is getting the data into the pipeline as quickly as possible. OTOH, we do have Alder Lake DDR4 boards, and they are not all low-end.

Three more hours...
 

D

Deleted member 215115

Guest
It should, considering it scales with basically everything... and it's also one of the most played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota or League, where any decent three-year-old 6-core will max them all out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; I won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to use very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
The game has many issues due to its engine, and Activision always manages to screw something up with every patch. Plus, it's multiplayer like the other games you mentioned, so :sleep: boring and really uninteresting. If that's where Intel chooses to dominate, I say great: take all the trash games and we'll keep the rest.

I'm really excited for Alder Lake though.
 
Joined
Jan 24, 2020
Messages
107 (0.07/day)
The game has many issues due to its engine, and Activision always manages to screw something up with every patch. Plus, it's multiplayer like the other games you mentioned, so :sleep: boring and really uninteresting. If that's where Intel chooses to dominate, I say great: take all the trash games and we'll keep the rest.

I'm really excited for Alder Lake though.

So where do you want the frames? I mean, I use a 240 Hz monitor and can't use anything else for a multiplayer game. Give me frames. They provide lower response times, easier aiming, more fun, more smoothness!

But for freaking Tomb Raider or Days Gone? Dude, I literally lock all of that stuff at 60 fps, use as much resolution and detail as possible and just have fun. Single-player gamers are the happiest ones. They can literally get an i3-10100F, pair it with baseline 3200 MHz RAM and run every game at 60 fps. For 80€, with passive or inaudible 400 rpm cooling and 30 watts while gaming :|
 

Joined
Sep 15, 2011
Messages
6,505 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Sorry, but even with those high speeds and all that bandwidth, the latencies are still poor, and this is proven by using a proper calculator like this one:

On topic, those sticks should be CL32 at most, while "ultra-low latency" should only be used for anything below that value.

Anything above 10 ns is junk, while good RAM kits are the ones at 9 ns or lower latency.
Guess we need to wait for the 13th Intel Core generation until we can get those... sadly.
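Following that 10 ns rule of thumb, one can solve the usual CL / MT/s × 2000 latency formula backwards for the CAS latency a given speed bin would need (a quick sketch of my own, not anything official):

```python
# Solve the CL / MT/s * 2000 latency formula backwards: what is the
# highest CAS latency that keeps a given speed bin at or under 10 ns?
def max_cl(target_ns, mts):
    return int(target_ns * mts / 2000)

for mts in (6000, 6400, 7000):
    print(f"DDR5-{mts}: CL{max_cl(10, mts)} or lower")
```

By that cut-off, DDR5-7000 would need CL35 or lower, so this CL40 kit (~11.4 ns) misses it.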
 
Joined
Jan 29, 2021
Messages
1,763 (1.45/day)
Location
Alaska USA
It should, considering it scales with basically everything... and it's also one of the most played games in the world right now. And it's one that makes overclocking and tuning actually FUN and USEFUL, something that isn't the case anymore in games like Overwatch, CS, Dota or League, where any decent three-year-old 6-core will max them all out.

So yeah, I would say Warzone should really be used as a benchmark.

Plus, that's not the only game where the 10900K completely smokes every other chip. I don't know if you tried the Halo Infinite multiplayer beta, but the 10900K again dominated the charts and the 5950X stood no chance; I won't even mention the 5800X or 5600X. The same happens in Escape from Tarkov or literally any other big-map online game. The 10900K is too good; it will be a 10-year chip just like Sandy Bridge. Ten cores on a single die plus the ability to use very fast RAM in Gear 1 is too good, no competition.

I am sure Alder Lake will beat the 10900K in every console-port single-player game and the like, just like Zen 3 does. But in massive multiplayer games where latency is everything, we all know the 10900K will keep dominating. No chance for Alder Lake, not with this memory config.
Alder Lake w/DDR4 seems to perform well in games.

 
Joined
Sep 17, 2019
Messages
452 (0.26/day)
(CAS latency / transfer rate in MT/s) x 2000 = latency in nanoseconds

DDR5 7000 CL40

40/7000 *2000 = 11.4ns

DDR4-3600 CL16 (good stuff)

16/3600 * 2000 = 8.9ns

DDR4-3200 CL18 (common stuff)

18/3200 * 2000 = 11.25ns


This is actually getting there on latency, a whole lot faster than what happened with DDR4 vs DDR3. Friday NDA lift will be an interesting day.
My 3200 timings are 14-14-14-34: 14/3200 * 2000 = 8.75ns. The RAM is G.Skill Flare X.
My 3600 timings are 14-14-14-34: 14/3600 * 2000 = 7.78ns. The RAM is an OLOY Blade 4000 kit downclocked to 3600.

The OLOY kit's rated 4000 timings are 15-15-15-36: 15/4000 * 2000 = 7.5ns.

You can see why I downclocked: I get essentially the same performance at lower voltage. This is also why I generally buy the best RAM I can find on sale; my G.Skill kit is still a viable option for my Ryzen CPUs.
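For what it's worth, every figure in this exchange drops out of the same formula; a quick Python sketch to double-check them:

```python
# First-word latency for the kits mentioned above, all from the
# same (CL / transfer rate in MT/s) x 2000 formula.
def latency_ns(cl, mts):
    return cl * 2000 / mts

print(round(latency_ns(40, 7000), 1))  # DDR5-7000 CL40: 11.4 ns
print(round(latency_ns(14, 3200), 2))  # Flare X at 3200: 8.75 ns
print(round(latency_ns(14, 3600), 2))  # OLOY downclocked to 3600: 7.78 ns
print(round(latency_ns(15, 4000), 2))  # OLOY at its rated 4000: 7.5 ns
```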
 