
NVIDIA GeForce RTX 4090 PCI-Express Scaling

gQx

Joined
Jul 1, 2018
Messages
85 (0.04/day)
I wonder, since you have this FPS chart, what about the temperature and power consumption differences between those configurations?
 
Joined
Jul 14, 2018
Messages
413 (0.20/day)
Location
Jakarta, Indonesia
System Name PC-GX1
Processor i9 10900 non K (stock) TDP 65w
Motherboard asrock b560 steel legend | Realtek ALC897
Cooling cooler master hyper 2x12 LED turbo argb | 5x12cm fan rgb intake | 3x12cm fan rgb exhaust
Memory corsair vengeance LPX 2x32gb ddr4 3600mhz
Video Card(s) MSI RTX 3080 10GB Gaming Z Trio LHR TDP 370w| 551.76 WHQL | MSI AB v4.65 | RTSS v7.35
Storage NVME 2+2TB gen3| SSD 4TB sata3 | 1+2TB 7200rpm sata3| 4+4+5TB USB3 (optional)
Display(s) AOC U34P2C (IPS panel, 3440x1440 75hz) + speaker 5W*2 | APC BX1100CI MS (660w)
Case lianli lancool 2 mesh RGB windows - white edition | 1x dvd-RW usb 3.0 (optional)
Audio Device(s) Nakamichi soundstation8w 2.1 100W RMS | Simbadda CST 9000N+ 2.1 352W RMS
Power Supply seasonic focus gx-850w 80+ gold - white edition 2021 | APC BX2200MI MS (1200w)
Mouse steelseries sensei ten | logitech g440
Keyboard steelseries apex 5 | steelseries QCK prism cloth XL | steelseries arctis 5
VR HMD -
Software dvd win 10 home 64bit oem + full update 22H2
Benchmark Scores -
OMG, thank you so much, dude... finally, my old PCIe 3.0 x16 can still handle the RTX 4090 24 GB... the difference is only 2-3%... cool...
 
Joined
Aug 25, 2021
Messages
998 (1.02/day)
No doubt there will be a string of buyers who will benefit from the uncompromising 5.0 flex, more so specific workstation-class builds, but for the wider majority (gamers/general productivity/office work/etc.) I can't see most of us needing anything above 3.0, and yet 4.0 is a probable long-term blessing in itself.

For most, by the time 5.0 becomes an essential piece of the puzzle, current platforms will be, from an overall performance-enthusiast perspective, ANCIENT! Saying that, as a gamer I'm still keeping my options open and might end up pulling the trigger on Zen 4... not necessarily for PCIe 5.0 but for socket longevity.
Be mindful that several PCIe standards co-exist on the same boards. You still have Gen3, Gen4 and Gen5 lanes on AM5 and 700-series chipsets, in different proportions.

The proportion of devices wired at different speeds will decide which platform appeals to your needs and budget. Diversity is the key here. We need the most advanced, current and older PCIe lanes for a variety of needs and peripherals. For example, it would be a waste to wire the vast majority of LAN ports with Gen4 lanes, unless it's 10 GbE. It would also be a waste to wire an entire motherboard with Gen5 lanes only, as no one really needs that.
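To put rough numbers on those generations, here's a quick sketch. The per-lane figures are the commonly quoted effective rates after encoding overhead, and `link_bandwidth` is just an illustrative helper, not anything from the review:

```python
# Approximate effective per-lane PCIe throughput in GB/s, one direction,
# after encoding overhead (128b/130b for Gen3 and up). Commonly quoted values.
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Effective one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# A Gen3 x16 link and a Gen4 x8 link offer essentially the same bandwidth,
# which is why scaling reviews treat them as one data point.
print(round(link_bandwidth(3, 16), 1))  # 15.8
print(round(link_bandwidth(4, 8), 1))   # 15.8
print(round(link_bandwidth(5, 16), 1))  # 63.0
```

The doubling per generation is also why a board can "waste" Gen4 lanes on slow peripherals: a 1 GbE LAN port needs barely an eighth of a single Gen3 lane.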

Agreed. PCIe 5.0 isn't worth the apparent extra cost (on motherboards) to the average user. Heck, to most users, the only practical difference between PCIe 3.0 and PCIe 4.0 SSDs is that the latter run hotter. No lie, I actually bought a secondary PCIe 4.0 SSD by accident recently, because the price was low enough that I assumed it was a 3.0 model. Learned otherwise when I saw the temperature after installing it, though, lol.

Not to say that there isn't merit in pushing these newer technologies to the consumer space, but I just don't have any compelling reason to care about what they offer. W1zzard's excellent testing here reinforces the point.
There is merit, as explained above. New technologies are needed. The user base is very diverse. If you do not have a reason to care personally, that's fine. But you should at least be open-minded about new technologies coming to the consumer space for those who want to use them for their needs. You and other users coexist in the same consumer space.
 
Joined
Apr 1, 2011
Messages
151 (0.03/day)
So I pre-ordered Raptor Lake, and have already gotten myself a 4090.

My question is: should I leave the Gen5 M.2 slot empty? Or is it fine if I use a Gen4 M.2 in it? If I do, would I also lose half the lanes?

I do not want to lose 1% or 2%. I will just use the other M.2 slots; I just wish I could use all the slots on my board. Thanks
Look at the block diagram in the motherboard manual; it's usually in the front pages. See whether the M.2 slot connects directly to the CPU or to the chipset. If the diagram's line leading to the M.2 slot doesn't merge with any PCIe slots or other M.2 slots, then those lanes are dedicated to that slot. If it connects to the chipset, also check the PCIe speed between the chipset and the CPU in the block diagram. Be prepared to purchase an M.2 heatpipe cooler if you get a 13 GB/s PCIe 5.0 SSD, because they get extremely hot. Make sure the location of that heatpipe does not interfere with the GPU heatsink, or with a CPU heatsink if you decide on CPU air cooling. Also keep in mind that PCIe 5.0 M.2 drives can be longer and wider: 25 mm × 110 mm instead of the common 22 mm × 80 mm used today.
 
Joined
Aug 25, 2021
Messages
998 (1.02/day)
So I pre-ordered Raptor Lake, and have already gotten myself a 4090.
My question is: should I leave the Gen5 M.2 slot empty? Or is it fine if I use a Gen4 M.2 in it? If I do, would I also lose half the lanes?
I do not want to lose 1% or 2%. I will just use the other M.2 slots; I just wish I could use all the slots on my board. Thanks
If you use any dedicated M.2 Gen5 slot with any NVMe drive on any Z790 board supporting Gen5 NVMe drives, your GPU slot will operate at x8 Gen5 with upcoming AMD GPUs and x8 Gen4 with Nvidia and Intel cards. This is because the CPU provides only x16 Gen5 lanes for the GPU and no additional dedicated Gen5 lanes for M.2 drives.

Motherboard vendors' workaround is to "steal" x8 lanes from the GPU in order to provide M.2 Gen5 capability. This means that whichever NVMe drive you put into a dedicated M.2 Gen5 slot, it will cut the GPU slot's lanes in half. You will need to check in the manual or in the BIOS whether the M.2 Gen5 slots operate this way by default, or whether the option needs to be toggled in the BIOS; different vendors may or may not enable it by default.

If you do not want to lose 1-2% on the GPU, then run bandwidth and scaling tests, such as the PCIe bandwidth test in 3DMark and a few games for scaling:
1. Test the GPU in x16 Gen4 mode (Nvidia 4000-series cards do not support Gen5) without an NVMe drive in the Gen5 slot.
2. Test the GPU in x8 Gen4 mode with any NVMe drive in the dedicated M.2 Gen5 slot.
See if there is any difference in data throughput and FPS, and let us know. I am quite curious to know.

If there is no difference, you are OK to put any NVMe drive into the dedicated M.2 Gen5 slot. If there is a 1-2% difference, it's negligible, practically zero; you will never notice it while using a 4090, so you can still leave your NVMe drive in that slot. If the difference is 5% or more, then leave all dedicated M.2 Gen5 slots empty.
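That decision rule can be written down directly. This is a hypothetical helper; the thresholds are the rules of thumb from this post, not anything measured:

```python
def fps_delta_pct(fps_x16: float, fps_x8: float) -> float:
    """Percent FPS lost when dropping the GPU from x16 to x8."""
    return (fps_x16 - fps_x8) / fps_x16 * 100

def verdict(delta_pct: float) -> str:
    # Thresholds per the rule of thumb above: 1-2% is noise,
    # 5% or more means keep the Gen5 M.2 slots empty.
    if delta_pct < 2.0:
        return "negligible, use the Gen5 M.2 slot freely"
    if delta_pct < 5.0:
        return "minor, acceptable for most users"
    return "significant, leave the Gen5 M.2 slots empty"

# e.g. 200 FPS at x16 vs 197 FPS at x8 is a 1.5% delta
print(verdict(fps_delta_pct(200, 197)))
```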

Below is a diagram of Z790 CPU and HSIO lanes. I hope this answers your questions.
[Attachment: Z790 CPU and HSIO lane diagram]



Look at the block diagram in the motherboard manual; it's usually in the front pages. See whether the M.2 slot connects directly to the CPU or to the chipset. If the diagram's line leading to the M.2 slot doesn't merge with any PCIe slots or other M.2 slots, then those lanes are dedicated to that slot. If it connects to the chipset, also check the PCIe speed between the chipset and the CPU in the block diagram. Be prepared to purchase an M.2 heatpipe cooler if you get a 13 GB/s PCIe 5.0 SSD, because they get extremely hot. Make sure the location of that heatpipe does not interfere with the GPU heatsink, or with a CPU heatsink if you decide on CPU air cooling. Also keep in mind that PCIe 5.0 M.2 drives can be longer and wider: 25 mm × 110 mm instead of the common 22 mm × 80 mm used today.
You did not answer his question. There is no M.2 Gen5 slot connected to the chipset. The CPU supports only x16 Gen5 lanes for the GPU, which can be bifurcated into x8 Gen5 for the GPU and x8 Gen5 for NVMe drives. Intel's chipset operates at Gen4 speeds for all peripherals.
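A minimal sketch of that bifurcation rule, assuming the CPU's x16 Gen5 lanes are the only Gen5 resource (the function name is mine, not from any vendor tool):

```python
def allocate_cpu_lanes(gen5_m2_populated: bool) -> dict:
    """Split the Raptor Lake CPU's x16 Gen5 lanes between GPU and M.2.

    The CPU exposes exactly 16 Gen5 lanes. Populating the board's
    Gen5 M.2 slot forces an x8/x8 split, halving the GPU link width;
    the chipset cannot contribute any Gen5 lanes of its own.
    """
    if gen5_m2_populated:
        return {"gpu_lanes": 8, "m2_gen5_lanes": 8}
    return {"gpu_lanes": 16, "m2_gen5_lanes": 0}

print(allocate_cpu_lanes(False))  # {'gpu_lanes': 16, 'm2_gen5_lanes': 0}
print(allocate_cpu_lanes(True))   # {'gpu_lanes': 8, 'm2_gen5_lanes': 8}
```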
 
Joined
Apr 1, 2011
Messages
151 (0.03/day)
You did not answer his question. There is no M.2 Gen5 slot connected to the chipset. The CPU supports only x16 Gen5 lanes for the GPU, which can be bifurcated into x8 Gen5 for the GPU and x8 Gen5 for NVMe drives. Intel's chipset operates at Gen4 speeds for all peripherals.
Wow, I didn't realize PCIe 5.0 support was so limited. I figured even budget boards would support at least one PCIe 5.0 M.2 without stealing lanes from the GPU.
 
Joined
Jan 14, 2019
Messages
9,881 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
So when limited to PCIe Gen 3.0 or Gen 4.0 x8, the $1600 4090 is slower in Metro: Exodus than the $900 6950 XT. Interesting. Still over 200 FPS at 1080p, but still interesting.
 
Joined
Aug 25, 2021
Messages
998 (1.02/day)
Wow, I didn't realize PCIe 5.0 support was so limited. I figured even budget boards would support at least one PCIe 5.0 M.2 without stealing lanes from the GPU.
It's a hard limit on Raptor Lake CPUs. Vendors need to work with the x16 Gen5 lanes and make a compromise if they want to enable M.2 Gen5 drives.
For more Gen5 lanes, either from the CPU or the chipset, consumers who need them will need to wait for Meteor Lake. Or consider the Zen 4 AM5 platform, where CPUs have x16 Gen5 for the GPU and another x8 for Gen5 drives. Below is a diagram of all lanes for the Ryzen X670E platform.

[Attachment: X670E platform lane diagram]

So when limited to PCIe Gen 3.0 or Gen 4.0 x8, the $1600 4090 is slower in Metro: Exodus than the $900 6950 XT. Interesting. Still over 200 FPS at 1080p, but still interesting.
True. People will need to look into specific scenarios. On average, the difference is small, but in some games the situation is different.
 

HTC

Joined
Apr 1, 2008
Messages
4,604 (0.78/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
It's a hard limit on Raptor Lake CPUs. Vendors need to work with the x16 Gen5 lanes and make a compromise if they want to enable M.2 Gen5 drives.
For more Gen5 lanes, either from the CPU or the chipset, consumers who need them will need to wait for Meteor Lake. Or consider the Zen 4 AM5 platform, where CPUs have x16 Gen5 for the GPU and another x8 for Gen5 drives. Below is a diagram of all lanes for the Ryzen X670E platform.

[Attachment: X670E platform lane diagram]

True. People will need to look into specific scenarios. On average, the difference is small, but in some games the situation is different.

Do you happen to have the diagrams for X670 non-E and B650(E)?
 
Joined
Jun 25, 2015
Messages
173 (0.05/day)
Location
The Wasteland
System Name 4K Master
Processor AMD 5950x
Motherboard MSI Creation x570
Cooling Black Ice Nemesis 360GTX+480GTS, Dual Pumps
Memory TridentZ DD4 64GB 3600Mhz
Video Card(s) MSI RTX3090
Storage OS: Optane 900p 480Gb, Games: Samsung 860 EVO 4Tb x3 + 970 EVO Plus 2Tb x2
Display(s) LG OLED 55C9
Case Phantecs Enthoo 719
Audio Device(s) Hyperx Cloud Orbit S
Power Supply Corsair HX1200i + CableMod Full KIt
Mouse Razer Mamba Elite
Keyboard Razer Huntsman Elite v2
VR HMD HP G2 v2 + Quest 2
Software Windows 11Dev Preview
To me, if I am paying $1,600 or more for a GPU, it had better have all of the newest bells and whistles. It doesn't have DP 2.0 or PCIe 5.0; even if it (or I) can't use them, I still want them because of how much it costs.
I'll be happy if they remove DisplayPort from high-end cards.
IMO it's useless; give me more HDMI 2.1.
It's time to abandon this idiotic dual format and go with a single universal display connection. HDMI 2.1 is great and lets anyone, with whatever device they have, use a PC monitor.
DP is limited to just PC monitors.
 
Joined
Aug 25, 2021
Messages
998 (1.02/day)
Do you happen to have the diagrams for X670 non-E and 650(E)?
It's the same for X670, apart from the GPU, which is x16 Gen4.

Below is B650E. For B650, the GPU and NVMe are Gen4.

[Attachment: B650E platform lane diagram]


I'll be happy if they remove DisplayPort from high-end cards.
IMO it's useless; give me more HDMI 2.1.
It's time to abandon this idiotic dual format and go with a single universal display connection. HDMI 2.1 is great and lets anyone, with whatever device they have, use a PC monitor.
DP is limited to just PC monitors.
DisplayPort IS the monitor standard. Nothing will change that. It is in several ways better than HDMI. You can tunnel DP through USB4 and have it in Alt Mode. DP is more flexible than HDMI: over one single USB-C cable you can have DP, USB, PCIe data and power. It's brilliant. Plus, multi-stream transport: you can daisy-chain monitors over DP. HDMI does not offer that level of flexibility. And DP 2.0 will offer up to 80 Gbps ports. It's better.

I do agree that GPUs should have one more HDMI port, though. Some vendors offer it; look up some Gigabyte and Asus cards. Also, HDMI has royalties. There are adapters too.
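For a sense of why DP 2.0's 80 Gbps headroom matters, here's a back-of-the-envelope bandwidth estimate. This is my own rough helper; the ~25% blanking overhead is a ballpark, as exact timings depend on the video timing standard in use:

```python
def video_bandwidth_gbps(width: int, height: int, refresh_hz: int,
                         bits_per_channel: int = 8,
                         overhead: float = 1.25) -> float:
    """Rough uncompressed bandwidth of a video mode in Gbit/s.

    Three colour channels; `overhead` approximates blanking intervals
    (~25% is a common ballpark, exact timings vary by standard).
    """
    bits_per_pixel = 3 * bits_per_channel
    return width * height * refresh_hz * bits_per_pixel * overhead / 1e9

# 4K at 120 Hz with 10-bit colour needs roughly 37 Gbit/s, close to the
# limit of HDMI 2.1's 48 Gbit/s link but comfortable within DP 2.0's 80.
print(round(video_bandwidth_gbps(3840, 2160, 120, bits_per_channel=10), 1))  # 37.3
```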
 
Joined
Jun 25, 2015
Messages
173 (0.05/day)
Location
The Wasteland
System Name 4K Master
Processor AMD 5950x
Motherboard MSI Creation x570
Cooling Black Ice Nemesis 360GTX+480GTS, Dual Pumps
Memory TridentZ DD4 64GB 3600Mhz
Video Card(s) MSI RTX3090
Storage OS: Optane 900p 480Gb, Games: Samsung 860 EVO 4Tb x3 + 970 EVO Plus 2Tb x2
Display(s) LG OLED 55C9
Case Phantecs Enthoo 719
Audio Device(s) Hyperx Cloud Orbit S
Power Supply Corsair HX1200i + CableMod Full KIt
Mouse Razer Mamba Elite
Keyboard Razer Huntsman Elite v2
VR HMD HP G2 v2 + Quest 2
Software Windows 11Dev Preview
So when limited to PCIe Gen 3.0 or Gen 4.0 x8, the $1600 4090 is slower in Metro: Exodus than the $900 6950 XT. Interesting. Still over 200 FPS at 1080p, but still interesting.

Why are you looking at 1080p for a $1600 card? Any resolution besides 4K is irrelevant.
At 4K, even in PCIe Gen 2.0 x16 [Gen 4.0 x4] mode the 4090 is 20 FPS faster than the 6950 XT.

In Gen 3.0 x16 / Gen 4.0 x8, it's 50 FPS faster.
 
Joined
Jan 14, 2019
Messages
9,881 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Why are you looking at 1080p for a $1600 card? Any resolution besides 4K is irrelevant.
At 4K, even in PCIe Gen 2.0 x16 [Gen 4.0 x4] mode the 4090 is 20 FPS faster than the 6950 XT.

In Gen 3.0 x16 / Gen 4.0 x8, it's 50 FPS faster.
Because that's the resolution that I'm using. I know the 4090 isn't meant for that, just like it isn't meant for me, but that's my point of reference anyway.
 
Joined
Jun 25, 2015
Messages
173 (0.05/day)
Location
The Wasteland
System Name 4K Master
Processor AMD 5950x
Motherboard MSI Creation x570
Cooling Black Ice Nemesis 360GTX+480GTS, Dual Pumps
Memory TridentZ DD4 64GB 3600Mhz
Video Card(s) MSI RTX3090
Storage OS: Optane 900p 480Gb, Games: Samsung 860 EVO 4Tb x3 + 970 EVO Plus 2Tb x2
Display(s) LG OLED 55C9
Case Phantecs Enthoo 719
Audio Device(s) Hyperx Cloud Orbit S
Power Supply Corsair HX1200i + CableMod Full KIt
Mouse Razer Mamba Elite
Keyboard Razer Huntsman Elite v2
VR HMD HP G2 v2 + Quest 2
Software Windows 11Dev Preview
DisplayPort IS the monitor standard. Nothing will change that. It is in several ways better than HDMI. You can tunnel DP through USB4 and have it in Alt Mode. DP is more flexible than HDMI: over one single USB-C cable you can have DP, USB, PCIe data and power. It's brilliant. Plus, multi-stream transport: you can daisy-chain monitors over DP. HDMI does not offer that level of flexibility. And DP 2.0 will offer up to 80 Gbps ports. It's better.

I do agree that GPUs should have one more HDMI port, though. Some vendors offer it; look up some Gigabyte and Asus cards. Also, HDMI has royalties. There are adapters too.

Royalties are the main reason why DP is used.
HDMI also has an alt mode and can run over USB-C with power.
With monitors, very few large models come out, so all the speed of DP is wasted.

I've been using OLEDs as my monitor, so at least I use the 4K/120 mode and it's necessary.
As for PC monitor users: lots of people for some reason buy TINY useless monitors like 27-inch (and some even smaller); there is no need for 4K at that size.
Even 21:9 35-inch models aren't 4K yet, and they don't need it.

What I'm saying is that the majority of monitor users, over 95%, use tiny sizes and don't go above 1440p, so 80 Gbps, 8K and so on are not for them.

What happens now is that people who can't afford a TV can't just buy any PC monitor and use it with other devices, since it will have royalty-free DP only; think of grabbing a cheap PC monitor and connecting it to photography equipment, which uses HDMI.
If we move to a single format, it will solve lots of issues.
 
Joined
Aug 25, 2021
Messages
998 (1.02/day)
What happens now is that people who can't afford a TV can't just buy any PC monitor and use it with other devices, since it will have royalty-free DP only; think of grabbing a cheap PC monitor and connecting it to photography equipment, which uses HDMI.
If we move to a single format, it will solve lots of issues.
True for HDMI Alt Mode; I forgot about it. I have checked, and the reason I forgot is that it uses the HDMI 1.4b standard, which is not implemented on many devices in the PC industry. I also have an OLED TV, but do not use it as a monitor. It has its own quirks: firstly it's too big, then text rendering, etc. I do use it for 4K/120 gaming sometimes.

I do not think DP speed is wasted. For monitors that need up to 21 Gbps of bandwidth, vendors install DP 1.2 ports. Above that, you get DP 1.4, for 4K and some HDR monitors. It's quite a rational decision.

There are billions of monitors at home, at work and in public places, of very different sizes, from small laptop displays to professional 32/40-inch panels, to gigantic public displays over 100 inches. It's fine. 80 Gbps ports will find their use; there is no doubt about it. Remember, with one 80 Gbps cable you can daisy-chain and simplify cabling in the workplace, for example. Super convenient.

There are simple adapters for all connectors; it's not a big deal. An increasing number of audio, video and storage devices are moving to USB-C. This is the one to rule them all, as you can flexibly choose which data protocols you want to offer over it, e.g. Thunderbolt, DP, PCIe, USB and power.
 
Joined
Mar 6, 2012
Messages
566 (0.13/day)
Processor i5 4670K - @ 4.8GHZ core
Motherboard MSI Z87 G43
Cooling Thermalright Ultra-120 *(Modded to fit on this motherboard)
Memory 16GB 2400MHZ
Video Card(s) HD7970 GHZ edition Sapphire
Storage Samsung 120GB 850 EVO & 4X 2TB HDD (Seagate)
Display(s) 42" Panasonice LED TV @120Hz
Case Corsair 200R
Audio Device(s) Xfi Xtreme Music with Hyper X Core
Power Supply Cooler Master 700 Watts
Control feels like a very badly coded game; the graphics look like they're from 2012, but the performance requirements are way too high.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
CPU limited, and the higher CPU overhead of Ada causes it to fall behind the older gen while CPU limited, because the driver eats a few extra CPU cycles that could otherwise go to the game to eke out a few more frames.
That's something I've never seen mentioned before: that the 40 series *needs* a faster CPU.
 
Joined
Jan 2, 2008
Messages
3,296 (0.55/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
I've been thinking about making a UE5-based workload for various testing, but I feel it's too early for a rendering workload and it might misrepresent what UE5 can do eventually...
Ahh yes.. true. Nevertheless, good review as always!
 

Hello_im_snowman

New Member
Joined
Oct 19, 2022
Messages
3 (0.01/day)

Absolutely brilliant article that has answered every question and curiosity I had about the 4090's PCIe scaling. I made an account just to upvote the content. Well done!
 

sizzling

New Member
Joined
Oct 21, 2022
Messages
1 (0.00/day)
I could not see it confirmed: did the PCIe 4.0 results have Resizable BAR enabled? Seeing as it's only supported on 4.0, it's a key factor for understanding the difference between 3.0 and 4.0.
 
Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
Below is a diagram of Z790 CPU and HSIO lanes. I hope this answers your questions.

[Attachment: Z790 CPU and HSIO lane diagram]
I'm not sure whether you generated that image or got it from somewhere else, but according to Intel's ARK and other Intel documentation, the Alder Lake/Raptor Lake CPUs only support bifurcating the PCIe x16 into two x8 links. There is no evidence of x8/x4/x4 as was supported on Z590 and before. So there can only be one PCIe 5.0 M.2 slot on an Intel CPU; the other 4 lanes would go unused in that case.

Also, Intel says "Up to 20 PCIe 4.0" and "Up to 8 PCIe 3.0" lanes for Z790, but your diagram shows 18 and 10, respectively. Two of the PCIe 3.0 lanes in your diagram should be PCIe 4.0.
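The lane accounting above is easy to sanity-check with trivial arithmetic; the dictionaries below just encode Intel's published totals versus what the diagram shows:

```python
# Intel's published Z790 chipset totals: up to 20 Gen4 + 8 Gen3 PCIe lanes.
# The diagram under discussion shows 18 + 10 instead.
INTEL_SPEC = {"gen4": 20, "gen3": 8}
DIAGRAM = {"gen4": 18, "gen3": 10}

def discrepancy(spec: dict, shown: dict) -> dict:
    """Per-generation difference between the spec and the diagram."""
    return {gen: spec[gen] - shown[gen] for gen in spec}

# The grand total of 28 lanes matches; only the Gen4/Gen3 split is off,
# so two lanes drawn as Gen3 should be Gen4.
print(discrepancy(INTEL_SPEC, DIAGRAM))  # {'gen4': 2, 'gen3': -2}
assert sum(INTEL_SPEC.values()) == sum(DIAGRAM.values()) == 28
```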
 