
NVIDIA GeForce RTX 4090 PCI-Express Scaling

W1zzard

Administrator
Staff member
The new NVIDIA GeForce RTX 4090 is a graphics card powerhouse, but what happens when you run it on a PCI-Express 4.0 x8 bus? In our mini-review we've also tested various PCI-Express 3.0, 2.0 and 1.1 configs to get a feel for how FPS scales with bandwidth.
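
For context on the link widths being compared, here is a quick sketch (my own numbers, not from the review) of the theoretical per-direction bandwidth of each tested configuration; it also shows why PCIe 4.0 x8 and PCIe 3.0 x16 land in exactly the same place:

```python
# Approximate usable PCIe bandwidth per lane, one direction, in GB/s,
# after line-encoding overhead (8b/10b for 1.1/2.0, 128b/130b for 3.0+).
PER_LANE_GBPS = {
    "1.1": 0.25,   # 2.5 GT/s
    "2.0": 0.50,   # 5.0 GT/s
    "3.0": 0.985,  # 8.0 GT/s
    "4.0": 1.969,  # 16.0 GT/s
    "5.0": 3.938,  # 32.0 GT/s
}

for gen, per_lane in PER_LANE_GBPS.items():
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:5.1f} GB/s")

# PCIe 4.0 x8 and PCIe 3.0 x16 both come out to ~15.8 GB/s per direction,
# which is why those two configurations behave the same in testing.
```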

 
Why not use Vulkan on RDR2?
 
So, I am only checking the 4K results, as that is where you see what a GPU is made of.

This article will be important for those deciding between AMD X670 and X670E, or B650 and B650E. From your findings, you just need to make sure the M.2 slot has the PCIe Gen 5 option, since the GPU will not be able to use it anyway. The 50 bucks or so saved there could be invested in a slightly faster GPU.

Funny question: is NVIDIA SLI completely dead, and if not, would it work with two 4090s?
 
So basically, if you've still got PCIe 3.0, you're still OK with that and a 4090...

But as @ir_cow mentioned, you'd probably have a huge CPU bottleneck if you were still on a PCIe 3.0 platform with a 4090...
 
The new NVIDIA GeForce RTX 4090 is a graphics card powerhouse, but what happens when you run it on a PCI-Express 4.0 x8 bus? In our mini-review we've also tested various PCI-Express 3.0, 2.0 and 1.1 configs to get a feel for how FPS scales with bandwidth.
Would the performance difference stay the same as shown in games (2-3%) if graphics productivity workloads were tested? We know that most games are not able to saturate an x8 Gen 4 link, but what happens when graphic designers, architects, and engineers use their powerful software on an x8 Gen 4 link with a 4090?
 
That settles the PCI-Express concerns if you use a Gen 5 SSD.
It should be the same for the other upcoming 4080 models too:

[image: RTX 4080 specifications chart]

NVIDIA cancels GeForce RTX 4080 12GB

Hi @W1zzard in your upcoming article featuring RTX 4090 performance with 13900K, are you going to use the newer 522.25 drivers (11/10) that improve DX12 performance?

NVIDIA’s new driver enables “substantial” DirectX12 performance improvements for GeForce RTX GPUs

"Our DirectX 12 optimizations apply to GeForce RTX graphics cards and laptops, though improvements will vary based on your specific system setup, and the game settings used. In our testing, performance increases were found in a wide variety of DirectX 12 games, across all resolutions:

  • Assassin’s Creed Valhalla: up to 24% (1080p)
  • Battlefield 2042: up to 7% (1080p)
  • Borderlands 3: up to 8% (1080p)
  • Call of Duty: Vanguard: up to 12% (4K)
  • Control: up to 6% (4K)
  • Cyberpunk 2077: up to 20% (1080p)
  • F1® 22: up to 17% (4K)
  • Far Cry 6: up to 5% (1440p)
  • Forza Horizon 5: up to 8% (1080p)
  • Horizon Zero Dawn: Complete Edition: up to 8% (4K)
  • Red Dead Redemption 2: up to 7% (1080p)
  • Shadow of the Tomb Raider: up to 5% (1080p)
  • Tom Clancy’s The Division 2: up to 5% (1080p)
  • Watch Dogs: Legion: up to 9% (1440p)"
 
Overall, it's about the difference I expected.
What surprised me, though, is the variance between individual games and across resolutions, both within the same game and compared to other games; I did not expect that big of a variance.
 
Why are you benchmarking GPUs with a 5800X? Shouldn't you be using the best CPUs for benchmarking GPUs? You have three other options for the best gaming CPUs: Alder Lake, Zen 4, and the 5800X3D.
 
Why are you benchmarking using a 5800X? Shouldn't you be using the best CPUs for benchmarking GPUs? You have three other options for the best gaming CPUs: Alder Lake, Zen 4, and the 5800X3D.

Why are you benchmarking GPUs with a 5800X? Shouldn't you be using the best CPUs for benchmarking GPUs? You have three other options for the best gaming CPUs: Alder Lake, Zen 4, and the 5800X3D.
I'm using a 13900K

Edit: soon
 
I'm using a 13900K

You might need to amend the test setup page... it's showing the 5800X.

Actually, I would have been happier if it were the 5800X, hehe... looks like I might end up with an AM4 platform if Zen 4/RPL can't be had at a preferred budget (no thanks to NVIDIA for contaminating the budget allocation).
 
Hah! PCIe 2.0 still being fine almost 16 years later :)

Sure, if you want that last 5% then you'll need PCIe 4.0, but these articles always prove that the PCIe race really isn't that necessary unless you're already chasing diminishing returns.
 
Hah! PCIe 2.0 still being fine almost 16 years later :)

Sure, if you want that last 5% then you'll need PCIe 4.0, but these articles always prove that the PCIe race really isn't that necessary unless you're already chasing diminishing returns.


It's hard to make more of an impact when you're already massively CPU-bound at 1080p; this is why, for the first time ever, the gap INCREASES with the resolution bump to 1440p.
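
To illustrate the point (a toy model of my own, with made-up numbers): the delivered frame rate is roughly the slower of what the CPU and the GPU can each produce, so a bandwidth-induced GPU slowdown only becomes visible once the GPU is the limiting side:

```python
# Toy model: delivered FPS is bounded by the slower of CPU and GPU.
# All numbers below are invented purely for illustration.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

pcie_penalty = 0.95  # assume a narrow PCIe link costs the GPU ~5%

# 1080p: CPU-bound, so the GPU-side penalty is completely hidden.
print(delivered_fps(200, 260), delivered_fps(200, 260 * pcie_penalty))  # 200 200

# 1440p: GPU-bound, so the full penalty shows up in the results.
print(delivered_fps(200, 180), delivered_fps(200, 180 * pcie_penalty))  # 180 171.0
```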
 
And I guess mostly AMD fans have trashed NVIDIA for not supporting PCIe 5.0.

Considering there's on average a 2% difference between PCIe 3.0 and 4.0, there would be a 0% performance improvement from using PCIe 5.0.

DP 2.0 for the RTX 4090 - that's relevant for some. PCIe 5.0? The time has yet to come.
 
It's hard to make more of an impact when you're already massively CPU-bound at 1080p

Looks like many titles are massively CPU-bound even at 4K too.

I think it's fair to say that the 4090 is so much more GPU than almost any CPU, display, or game can fully use right now that the entire raison d'être for the 4090 isn't here yet.

Maybe 240 Hz 4K displays will give it a purpose beyond bragging rights.
 
Looks like many titles are massively CPU-bound even at 4K too.

I think it's fair to say that the 4090 is so much more GPU than almost any CPU, display, or game can fully use right now that the entire raison d'être for the 4090 isn't here yet.

Maybe 240 Hz 4K displays will give it a purpose beyond bragging rights.


Disagree, man; look at the gap at 1440p versus the 3080:

[chart: relative performance at 2560x1440]


A new, much more GPU-limited gap once you double the res (now 80% versus 88% on PCIe 1.1).



It would be nice if we could fix the CPU limits at 1080p, but at least we can switch to analysis at 1440p.
 
If I were to use two Gen 4 NVMe SSDs in RAID 0 on the M.2 slots that are wired to the chipset of a high-end Z790 board like the ASUS ROG Maximus Z790 Extreme, would I get the full PCIe x16 for the graphics card and get speeds matching a single Gen 5 NVMe SSD installed in the Gen 5 slot?
 
Still, anything above PCIe 3.0 is pointless, I see.
 
The relative performance, noise, perf/W, and perf/money graphs are always really awesome at TPU, and this is no different. Thank you @W1zzard

I would really like to see PCIe 5.0 graphics cards, so I know for sure that in the future whatever plans I have for the second PCIe x8 slot will definitely not affect graphics card performance. PCIe 5.0 x8 would be plenty for graphics, leaving x8 for other stuff on the platform.
 
Hi @W1zzard in your upcoming article featuring RTX 4090 performance with 13900K, are you going to use the newer 522.25 drivers (11/10) that improve DX12 performance?
Of course... the press drivers that I used also include these improvements.

Why not use Vulkan on RDR2?
On cards with small VRAM it will just crash; with DirectX 12 it will run slower, but stable.

when graphic designers, architects, and engineers use their powerful software
What software is that? The people you listed don't need such a fast GPU as far as I understand; they only need some basic viewport acceleration.

If I were to use two Gen 4 NVMe SSDs in RAID 0 on the M.2 slots that are wired to the chipset of a high-end Z790 board like the ASUS ROG Maximus Z790 Extreme, would I get the full PCIe x16 for the graphics card and get speeds matching a single Gen 5 NVMe SSD installed in the Gen 5 slot?
Good question; hard to say without data for Gen 5 drives. "10 GB/s" is just sequential throughput, which is of almost no consequence in real life.
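
As a rough back-of-the-envelope sketch (my assumptions, not measured data: ~7.9 GB/s link limit per Gen 4 x4 drive, ~15.8 GB/s per Gen 5 x4 drive, and a DMI 4.0 x8-class chipset uplink of ~15.8 GB/s; check the board's manual for the real topology):

```python
# Theoretical sequential throughput, GB/s, one direction.
GEN4_X4 = 7.9          # link limit of one Gen 4 x4 NVMe SSD
GEN5_X4 = 15.8         # link limit of one Gen 5 x4 NVMe SSD
CHIPSET_UPLINK = 15.8  # assumed DMI 4.0 x8-class link to the CPU

# Two Gen 4 drives striped behind the chipset are capped by the uplink,
# which is also shared with everything else hanging off the chipset.
raid0 = min(2 * GEN4_X4, CHIPSET_UPLINK)
print(f"RAID 0 of 2x Gen 4 x4: ~{raid0:.1f} GB/s (shared chipset uplink)")
print(f"Single Gen 5 x4 drive: ~{GEN5_X4:.1f} GB/s (CPU lanes)")
```

So on paper the two options land in the same ballpark for sequentials, but as noted above, sequential throughput is rarely what matters in practice.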
 
I'm using a 13900K

Edit: soon
So you didn't want to change the benchmark system right before having to change it again for Raptor Lake? That makes sense.

The issue is that testing with the 5800X is disingenuous; not that I would blame you for not wanting to change setups twice in such quick succession.

I do hope you revisit the PCIe scaling and GPU performance when you get the 13900K. It seems the average 4K numbers for this 4090 review are 20-30% below what others got.
 
And I guess mostly AMD fans have trashed NVIDIA for not supporting PCIe 5.0.

Considering there's on average a 2% difference between PCIe 3.0 and 4.0, there would be a 0% performance improvement from using PCIe 5.0.

DP 2.0 for the RTX 4090 - that's relevant for some. PCIe 5.0? The time has yet to come.
To me, if I am paying $1,600 or more for a GPU, it had better have all of the newest bells and whistles. It doesn't have DP 2.0 or PCIe 5.0; even if it (or I) can't use them, I still want them because of how much it costs.
 
@W1zzard Wouldn't the most correct test methodology be pairing the 4090 with a 7700X?
The 5800X cannot extract the full potential of this GPU.
 