
16x 8x Lanes, NVMe, CPU/SB link

Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
Hi there,
I recently built a PC on Z390.
For SSDs, it offers two M.2 PCI-E slots next to (under) the southbridge cover and two more PCI-E slots next to the CPU (via Asus' DIMM.2 riser card).
On top of that there's the GPU.
The BIOS offers some cryptic options I won't go into here, but in short:
My 2080 Ti GPU runs at x8 now because I have two SSDs on the CPU-bound DIMM.2 card.
I presume the SSDs run at optimal speed this way (CPU-linked), but the GPU is stuck at x8.

If I move both SSDs to the chipset ports under the southbridge cover (which run hotter), will the GPU go from x8 to x16, but the SSDs' speed get worse?
Common-sense answers welcome; I have little time to run full benchmarks with every option. I could also put one SSD on the DIMM.2 and one on the SB...
Thanks in advance!
 
Joined
Nov 11, 2004
Messages
4,853 (0.87/day)
Location
Formosa
System Name Overlord Mk MXVI
Processor AMD Ryzen 7 3800X
Motherboard Gigabyte X570 Aorus Master
Cooling Corsair H115i Pro
Memory 32GB Viper Steel 3600 DDR4 @ 3800MHz 16-19-16-19-36
Video Card(s) Gigabyte RTX 2080 Gaming OC 8G
Storage 1TB WD Black NVMe (2018), 2TB Viper VPN100, 1TB WD Blue 3D NAND
Display(s) Asus PG27AQ
Case Corsair Carbide 275Q
Audio Device(s) Corsair Virtuoso SE
Power Supply Corsair RM750
Mouse Logitech G500s
Keyboard Wooting Two
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/ztiub6
Yes, the graphics card would run at x16 if you did that, as the other two slots would be connected to the chipset.
If you do that, you have to share the DMI 3.0 interface (which is essentially four PCIe 3.0 lanes) with everything else that connects through the chipset.
In most instances this won't matter much; in everyday use you're most likely not going to see any performance issues.

Using the DIMM.2 at all reduces the GPU link to x8.
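For a rough sense of scale, here is a tiny Python sketch (theoretical PCIe 3.0 numbers only, ignoring protocol overhead) of what those lane counts mean in raw bandwidth:

[CODE]
# Theoretical PCIe 3.0 bandwidth per lane: 8 GT/s with 128b/130b encoding,
# i.e. roughly 8 * (128/130) / 8 bits-per-byte ~= 0.985 GB/s per lane.
GBPS_PER_LANE = 8 * (128 / 130) / 8

for lanes, label in [(4, "DMI 3.0 / chipset uplink (x4)"),
                     (8, "GPU running at x8"),
                     (16, "GPU running at x16")]:
    print(f"{label:30s} ~{lanes * GBPS_PER_LANE:5.2f} GB/s")
[/CODE]

That works out to roughly 3.94, 7.88 and 15.75 GB/s respectively, which is why everything hanging off the chipset has to fit inside that ~4 GB/s uplink.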
 
Joined
Oct 18, 2013
Messages
1,648 (0.71/day)
Location
Somewhere, nowhere, anywhere but HERE !
System Name The Big Black Smoked One
Processor i7-9700k, oc'd to 4.9 ghz
Motherboard Asus ROG Strix-E Gaming
Cooling CoolerMaster AIO 280mm push/pull, 8x Corsair 140mm ML RGB Fans (Front, top, sides, rear)
Memory 32GB Corsair Vengeance RGB Pro DDR4-3200 (4x 8GB) in XMP 2x
Video Card(s) Zotac GTX 2080Ti w/15% o/c
Storage 2x WD Black SN750 1TB nvme + 2x 2TB Crucial SSD
Display(s) 2x Samsung 43" LCD's @4k-1440p
Case Thermaltake Level 20 XT
Audio Device(s) Built-in
Power Supply EVGA G2 SuperNova 850W Modular
Mouse DAS Professional mechanical
Keyboard Logitech MX Master 2
Software Windows 10 pro 64 bit, with all the unnecessary background shiite turned OFF !
Benchmark Scores Quicker than flies on a dung pile
If you need max GPU performance, put the GPU in the x16 slot and move the M.2s to the motherboard slots...

If you need max drive performance, reverse the above and let the GPU run at x8...
 
Last edited:
Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
I am doing some GPU PCI-E saturation benchmarks (the 3DMark PCI Express feature test), but apparently no one in the world, as per 3DMark's results comparison, has run this test with my setup, lol.

Walking in the dark regarding x8/x16; people say the 2080 Ti is reaching PCI-E 3.0 limits... no idea whether that's an urban legend.

Not sure whether I need max GPU bandwidth or max SSD bandwidth... the problem of gaming on a work PC... :)
Thanks guys!

The DIMM.2 card is so cute it decided my purchase, but while it was fully supported on the X299 platform, on Z390 it's only half-assed.

As a side note, the Z390 with the 5 GHz 9900KS is much more responsive in general Windows use than the X299 with the i9-7900X, but it has fewer lanes...
 
Last edited:

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
26,839 (5.16/day)
Location
Indiana, USA
Processor Intel Core i7 9900K@5.0GHz
Motherboard AsRock Z370 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB Corsair DDR4-3000
Video Card(s) ASUS Strix GTX 1080Ti
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I wouldn't even bother using the DIMM.2 off the CPU. You'll never notice the difference between the NVMe drive running off the CPU or chipset.
 
Joined
Dec 31, 2009
Messages
16,601 (4.49/day)
I wouldn't even bother using the DIMM.2 off the CPU. You'll never notice the difference between the NVMe drive running off the CPU or chipset.
But if you are pounding on two drives through the DMI... you are limited to 3.0 x4, and a single fast drive can just about saturate that.
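To put rough numbers on that (the DMI figure is theoretical; the ~3.5 GB/s drive figure is a typical spec-sheet sequential read for a fast PCIe 3.0 NVMe drive, not something measured in this thread):

[CODE]
# DMI 3.0 is roughly a PCIe 3.0 x4 link: ~3.94 GB/s usable.
DMI_3_0_GBPS = 4 * 8 * (128 / 130) / 8

# Assumed spec-sheet sequential read of a fast PCIe 3.0 NVMe drive.
FAST_NVME_SEQ_READ_GBPS = 3.5

for drives in (1, 2):
    demand = drives * FAST_NVME_SEQ_READ_GBPS
    verdict = "bottlenecked by DMI" if demand > DMI_3_0_GBPS else "fits (barely)"
    print(f"{drives} drive(s) at full sequential read: ~{demand:.1f} GB/s "
          f"vs ~{DMI_3_0_GBPS:.2f} GB/s DMI -> {verdict}")
[/CODE]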
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
26,839 (5.16/day)
Location
Indiana, USA
Processor Intel Core i7 9900K@5.0GHz
Motherboard AsRock Z370 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB Corsair DDR4-3000
Video Card(s) ASUS Strix GTX 1080Ti
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
But if you are pounding on two drives through the DMI... you are limited to 3.0 x4, and a single fast drive can just about saturate that.
It depends on how you're pounding on the drives. Not all of the data being transferred to the drives goes through the link between the CPU and the chipset; that's how DMA works. If you're transferring data from one drive to the next, or from the network to the drives (and even 10 Gb/s networking isn't going to saturate even one drive, but whatever), the data flows directly from the source to the destination with very little going over the DMI link between the CPU and chipset.

The only time you're going to see the DMI link become a limitation is if you are transferring data directly from memory to the drives. For example, exporting media in Adobe Premiere, where it puts everything in memory (as long as you have enough), encodes it in memory, then transfers it all out of memory in one bulk transfer once the encoding is done. But that's kind of a rare situation, and most people who use Premiere queue their exports through Adobe Media Encoder to avoid the extremely high memory usage caused by exporting directly in Premiere.

Most of the other times you'd be doing heavy data transfer from the SSDs to the CPU, it's because you're doing some kind of render on the CPU and the render is pulling data from the drives. But even then, the CPU is almost certainly going to be the limiting factor in how fast the render can go, and the amount of data it requests is not going to saturate the DMI link.
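A rough back-of-the-envelope way to frame it: what matters is how much traffic actually has to cross the CPU-to-chipset link, not how fast the drives themselves are. The workload rates below are made-up placeholders, purely for illustration:

[CODE]
# DMI 3.0 ceiling, roughly equivalent to PCIe 3.0 x4.
DMI_3_0_GBPS = 4 * 8 * (128 / 130) / 8  # ~3.94 GB/s

def dmi_utilisation(label, gbps_crossing_dmi):
    """Report what fraction of the DMI link a given traffic rate would use."""
    share = gbps_crossing_dmi / DMI_3_0_GBPS
    print(f"{label:48s} {gbps_crossing_dmi:4.1f} GB/s -> {share:4.0%} of DMI")

# Hypothetical scenarios (numbers are illustrative, not measurements):
dmi_utilisation("Bulk export from RAM to a chipset NVMe drive", 3.0)
dmi_utilisation("CPU-bound render streaming assets from SSD", 0.5)
[/CODE]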
 
Last edited:
Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
Very interesting comments! Thanks.
One SSD is Windows + apps, and one is games.

I will try to run some benchies to add drama ;)
 
Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
Well, well, well!
A shame that 3DMark doesn't keep track of this PCI-E bench, and doesn't save the results under any tag.

But I can tell you: HUGE DIFFERENCE IN FAVOUR OF x16!!!!

When I ran it at x8, the frame rate was around 7 fps and the result was 6.5 GB/s.

Now at x16 the results are below; besides the viewing experience being a bit better at around 13 fps, the bandwidth doubled: 13.19 GB/s!
:peace:


[Attachment: 3DMark PCI Express feature test screenshot]
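Putting those measurements next to the theoretical PCIe 3.0 ceilings (a quick sanity check, nothing more; the test is purely synthetic):

[CODE]
# Measured 3DMark PCI Express feature-test results from the posts above,
# compared against theoretical PCIe 3.0 bandwidth (ignoring overhead).
GBPS_PER_LANE = 8 * (128 / 130) / 8  # ~0.985 GB/s per PCIe 3.0 lane

measured = {8: 6.5, 16: 13.19}  # lanes -> measured GB/s

for lanes, gbps in measured.items():
    theoretical = lanes * GBPS_PER_LANE
    print(f"x{lanes:<2d}: ~{gbps:5.2f} GB/s measured of ~{theoretical:5.2f} GB/s "
          f"theoretical ({gbps / theoretical:.0%})")
[/CODE]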
 
Joined
Dec 31, 2009
Messages
16,601 (4.49/day)
When I ran it at x8, the frame rate was around 7 fps and the result was 6.5 GB/s.

Now at x16 the results are below; besides the viewing experience being a bit better at around 13 fps, the bandwidth doubled: 13.19 GB/s!
Cool beans. You understand this doesn't translate to GPU performance, right?

Here, let me get that link I mentioned earlier........
 
Joined
Apr 24, 2012
Messages
1,343 (0.47/day)
Location
Northamptonshire, UK
System Name Main/VR/HTPC
Processor 9900K@4.7/Ryzen 7 2700/i3 4360@3.7
Motherboard Gigabyte Z390 Aorus Pro WiFi/Aorus B450I Pro Wifi/MSI H81I
Cooling Custom water/Wraith Spire/Antec C40
Memory Corsair LPX 2x8 3200MHz/HyperX Predator 2x8GB 3200MHz/Crucial Balistix Tactical 8GB
Video Card(s) MSI RTX 2080 Super Ventus /MSI RTX 2070 Super Ventus/Intel HD4600
Storage Samsung 960 EVO 250GB/840 EVO 120GB/OCZ ARC 100 120 GB
Display(s) Acer Z301c/39" Panasonic HDTV
Case Fractal Design Arc XL/Cougar QBX/Thermaltake Core V1
Audio Device(s) Yamaha RX-V379/Nvidia HD Audio/ Intel HD Audio
Power Supply Corsair RM850/ Corsair TX650/350W OEM
Mouse Logitech G900
Keyboard Logitech G610 Orion Red
Software Win 10 Pro 64Bit
What's the motherboard?
At worst, you might lose some SATA ports if you use both M.2 slots. That's what mine does at least.
 
Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
@EarthDog thanks, great article from @W1zzard. Yet I'd like to know and test some more :)
Games, drivers, etc. change over the months... and freaks like us will keep tweaking even if it's just for the experience, right?
@gdallsk In my case, the SATA ports (5 and 6) are shared with PCI-E slot 3. I had six drives but I killed one with coolant... just like my GFX 1080 :-(
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
26,839 (5.16/day)
Location
Indiana, USA
Processor Intel Core i7 9900K@5.0GHz
Motherboard AsRock Z370 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB Corsair DDR4-3000
Video Card(s) ASUS Strix GTX 1080Ti
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Cool beans. You understand this doesn't translate to GPU performance, right?
I was just going to say this. Of course there will be a difference in the benchmark; it's designed to show PCI-E bandwidth differences. Synthetic tests don't translate to real-world performance, just like most synthetic storage benchmarks don't translate to real-world performance.
 
Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
I will test some more at x8 and x16, it can't hurt! Now we have Metro Exodus, COD MW... I just need to figure out how, lol.
 
Joined
Dec 31, 2009
Messages
16,601 (4.49/day)
I will test some more at x8 and x16, it can't hurt! Now we have Metro Exodus, COD MW... I just need to figure out how, lol.
It's only time... your time (ty!). :p

Metro Exodus has a built-in benchmark... use it (though it's only 90 s long). COD MW... not sure if it has one, and multiplayer testing is borderline useless due to the inherent variability of MP.
 
Joined
Apr 24, 2012
Messages
1,343 (0.47/day)
Location
Northamptonshire, UK
System Name Main/VR/HTPC
Processor 9900K@4.7/Ryzen 7 2700/i3 4360@3.7
Motherboard Gigabyte Z390 Aorus Pro WiFi/Aorus B450I Pro Wifi/MSI H81I
Cooling Custom water/Wraith Spire/Antec C40
Memory Corsair LPX 2x8 3200MHz/HyperX Predator 2x8GB 3200MHz/Crucial Balistix Tactical 8GB
Video Card(s) MSI RTX 2080 Super Ventus /MSI RTX 2070 Super Ventus/Intel HD4600
Storage Samsung 960 EVO 250GB/840 EVO 120GB/OCZ ARC 100 120 GB
Display(s) Acer Z301c/39" Panasonic HDTV
Case Fractal Design Arc XL/Cougar QBX/Thermaltake Core V1
Audio Device(s) Yamaha RX-V379/Nvidia HD Audio/ Intel HD Audio
Power Supply Corsair RM850/ Corsair TX650/350W OEM
Mouse Logitech G900
Keyboard Logitech G610 Orion Red
Software Win 10 Pro 64Bit
In my case, the SATA ports (5 and 6) are shared with PCI-E slot 3.
That same PCIe slot might be shared with the M.2 slot as well; you still haven't told me what motherboard you're using.
 
Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
That same PCIe slot might be shared with the M.2 slot as well; you still haven't told me what motherboard you're using.
OMG! Maximus XI Extreme. It has lots of slots, but they're limited by the chipset. Gotta update my TPU profile.
 
Joined
Apr 24, 2012
Messages
1,343 (0.47/day)
Location
Northamptonshire, UK
System Name Main/VR/HTPC
Processor 9900K@4.7/Ryzen 7 2700/i3 4360@3.7
Motherboard Gigabyte Z390 Aorus Pro WiFi/Aorus B450I Pro Wifi/MSI H81I
Cooling Custom water/Wraith Spire/Antec C40
Memory Corsair LPX 2x8 3200MHz/HyperX Predator 2x8GB 3200MHz/Crucial Balistix Tactical 8GB
Video Card(s) MSI RTX 2080 Super Ventus /MSI RTX 2070 Super Ventus/Intel HD4600
Storage Samsung 960 EVO 250GB/840 EVO 120GB/OCZ ARC 100 120 GB
Display(s) Acer Z301c/39" Panasonic HDTV
Case Fractal Design Arc XL/Cougar QBX/Thermaltake Core V1
Audio Device(s) Yamaha RX-V379/Nvidia HD Audio/ Intel HD Audio
Power Supply Corsair RM850/ Corsair TX650/350W OEM
Mouse Logitech G900
Keyboard Logitech G610 Orion Red
Software Win 10 Pro 64Bit
OMG! Maximus XI Extreme. It has lots of slots, but they're limited by the chipset. Gotta update my TPU profile.
Luckily for you, the manual doesn't actually state that it's disabling slots or ports. Have you actually tried plugging your NVMe drives into those slots?
Something like AIDA64 will tell you what speed they're running at.

[Attachment: screenshot]


The DIMM.2 card is working as intended
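If you'd rather check from the command line, here's a small Python sketch that asks nvidia-smi for the GPU's negotiated link (this assumes an NVIDIA card with nvidia-smi on the PATH; AIDA64 and GPU-Z report the same thing graphically):

[CODE]
# Query the current PCIe generation and link width the GPU has negotiated.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "GeForce RTX 2080 Ti, 3, 8" -> Gen3 x8
[/CODE]

Note that on some cards the reported generation drops at idle for power saving, so check it under load.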
 
Joined
Mar 24, 2010
Messages
5,047 (1.39/day)
Location
Iberian Peninsula
BIOS, GPU-Z and AIDA all tell me the speed. In short, for the GPU to run at x16 you have to avoid using the DIMM.2 and/or PCI-E slot 2.
Specs updated.
Next step is to bench the SSDs; for now I'm installing some ultra-high-def games.
Cheers there

[Attachment: benchmark screenshot]


I have repeated these benches at maximum resolution and the results are not consistent... I'll return to this tonight, work calls now :rolleyes:
 
Last edited: