
Will performance decrease on an ASUS RTX 4070 Ti Super on a PCIe 4.0 x8 bus?

Processor AMD Ryzen 9 7900X3D
Motherboard ASUS ROG Crosshair X670E Hero
Cooling ASUS ROG Strix LC III 360
Memory G.Skill 48GB (2x24) TZ5 Neo RGB EXPO 6400MHz CL32
Video Card(s) ASUS TUF RTX 4070 Ti Super
Storage Adata XPG SX8200 Pro 2x2TB, Adata XPG SX8100 3x2TB
Display(s) Dell 34" Curved Gaming Monitor S3422DWG
Case Corsair 5000X
Power Supply Corsair RM1000x SHIFT 80 PLUS Gold
Mouse Asus ROG Gladius II Core
Keyboard Asus ROG Strix Scope
Hello everyone,

My system's main components are an AMD Ryzen 9 7900X3D, an ASUS ROG Strix B650E-E Gaming, and an ASUS TUF RTX 4070 Ti Super.

My motherboard has four NVMe M.2 slots in total. Two of them are PCIe 5.0, and when an SSD is installed in the second M.2 slot, the GPU slot drops from x16 to x8 (PCIe Gen 5). How much performance will I lose when the link drops from x16 to x8? That is what I would like to find out.
My graphics card is PCIe Gen 4; the motherboard is Gen 5.
When the link drops from x16 to x8, that still works out to the same bandwidth as Gen 3 x16.
I know that most graphics cards do not use the full Gen 4 x16 bandwidth, and that even Gen 3 x16 is more than enough.
But I would still be happy to hear from those who know.
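
For a rough sense of the raw numbers, here is a minimal Python sketch (an illustration only: it computes theoretical per-direction link bandwidth with 128b/130b encoding and ignores protocol overhead and what a game actually pushes across the bus):

# Theoretical per-direction PCIe link bandwidth, assuming 128b/130b encoding
# (valid for Gen 3/4/5). Real game traffic sits far below these ceilings.
GT_PER_LANE = {3: 8.0, 4: 16.0, 5: 32.0}  # gigatransfers per second, per lane

def pcie_gb_per_s(gen: int, lanes: int) -> float:
    # 128 payload bits per 130 line bits, 8 bits per byte
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

for gen, lanes in [(3, 16), (4, 16), (4, 8), (5, 8)]:
    print(f"PCIe {gen}.0 x{lanes}: {pcie_gb_per_s(gen, lanes):.2f} GB/s")

# Gen 4 x8 and Gen 3 x16 both come out to ~15.75 GB/s, which is why the two
# are treated as equivalent; Gen 4 x16 and Gen 5 x8 are both ~31.5 GB/s.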
 
On average it's 2~3% on a 4090, and it's gonna be less on a weaker GPU.

 
On average it's 2~3% on a 4090, and it's gonna be less on a weaker GPU.


Then dropping to x8 won't affect overall performance too much.

I'm taking a look at the review you shared right now.
 
Zero on that GPU, but maybe whenever you upgrade to something faster than a 5090.

I try to avoid boards with this sort of setup but it is what it is.

You'll likely be fine though for a long time.
 
Zero on that GPU, but maybe whenever you upgrade to something faster than a 5090.

I try to avoid boards with this sort of setup but it is what it is.

You'll likely be fine though for a long time.
If I switch to the 5000 series, it will be to a 5070 Ti, or a 5070 Super if that comes out.

When I bought the B650E-E, I bought it because many of its features were enough for me, and I knew the x16 slot would lose bandwidth when I used all the M.2 slots. There was no reason for me to buy an X670E. My previous motherboard was an X570 Crosshair; even though I changed tiers, I didn't mind, because the B650E-E is a better board than my previous one. The higher-end X670E models (Crosshair, Extreme, etc.) are very expensive, so I put that part of the budget toward the graphics card instead.

Top-tier cards are beyond me. I think, as you said, I won't have bandwidth problems for a long time.
 
It shows up in benchmarks even on a lower end GPU. Can you notice? Not really.
 
PCIe 3.0 x16 / 4.0 x8 offers about 16 GB/s of bandwidth per direction. The fastest PCIe 5.0 SSDs can move about 12 GB/s. Do the math.
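(Worked out: 8 GT/s × 16 lanes × 128/130 encoding ÷ 8 bits per byte ≈ 15.75 GB/s each way for Gen 3 x16, and Gen 4 x8 is identical, so even the ~12 GB/s of a top Gen 5 SSD fits inside that link with room to spare.)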
 
If I switch to the 5000 series, it will be to a 5070 Ti, or a 5070 Super if that comes out.

When I bought the B650E-E, I bought it because many of its features were enough for me, and I knew the x16 slot would lose bandwidth when I used all the M.2 slots. There was no reason for me to buy an X670E. My previous motherboard was an X570 Crosshair; even though I changed tiers, I didn't mind, because the B650E-E is a better board than my previous one. The higher-end X670E models (Crosshair, Extreme, etc.) are very expensive, so I put that part of the budget toward the graphics card instead.

Top-tier cards are beyond me. I think, as you said, I won't have bandwidth problems for a long time.

The Crosshair 8 was such a great motherboard in its day, though, and even though boards in the $300-400 range have surpassed it overall, I still feel like you have to decide what features you're OK living without.

With ASUS you are always going to pay a premium if you want the extra stuff; it comes with the territory. They do make nice boards at the Strix level and above, though.

My X670E was free, so I didn't really get to pick; I would probably have picked something else, lol, but it's been fine.

But yeah, in normal usage you aren't going to notice it. My mention of something faster than the 5090 was just that if or when you get to that point it may or may not be an issue, but you'll likely upgrade platforms before then anyway.

And a theoretical Gen 5 PCIe GPU will likely be fine even at x8.
 
16GB of VRAM certainly helps out a lot here, because there is less RAM/VRAM swapping over the PCIe bus.

Overall the difference will be 1-2%.
 
Hello everyone,

My system's main components are an AMD Ryzen 9 7900X3D, an ASUS ROG Strix B650E-E Gaming, and an ASUS TUF RTX 4070 Ti Super.

My motherboard has four NVMe M.2 slots in total. Two of them are PCIe 5.0, and when an SSD is installed in the second M.2 slot, the GPU slot drops from x16 to x8 (PCIe Gen 5). How much performance will I lose when the link drops from x16 to x8? That is what I would like to find out.
My graphics card is PCIe Gen 4; the motherboard is Gen 5.
When the link drops from x16 to x8, that still works out to the same bandwidth as Gen 3 x16.
I know that most graphics cards do not use the full Gen 4 x16 bandwidth, and that even Gen 3 x16 is more than enough.
But I would still be happy to hear from those who know.
Step 1.
Run Heaven or TimeSpy with the drive in.
Step 2.
Remove the drive and run Heaven or TimeSpy
Step 3.
Compare the scores

something, something, profit?
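
If pulling drives is a hassle, another option is simply to confirm what link the card is actually negotiating. A minimal Python sketch using pynvml (assuming an NVIDIA GPU and the nvidia-ml-py package installed; GPU-Z's "Bus Interface" field plus its render test reports the same information):

# Query the GPU's current vs. maximum PCIe link via NVML.
# Note: the link generation drops at idle to save power, so check it while
# the GPU is under load; the width is what matters for the x16 vs. x8 question.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex, nvmlDeviceGetName,
    nvmlDeviceGetCurrPcieLinkGeneration, nvmlDeviceGetCurrPcieLinkWidth,
    nvmlDeviceGetMaxPcieLinkGeneration, nvmlDeviceGetMaxPcieLinkWidth,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    print(nvmlDeviceGetName(gpu))
    print(f"current: Gen {nvmlDeviceGetCurrPcieLinkGeneration(gpu)} "
          f"x{nvmlDeviceGetCurrPcieLinkWidth(gpu)}")
    print(f"maximum: Gen {nvmlDeviceGetMaxPcieLinkGeneration(gpu)} "
          f"x{nvmlDeviceGetMaxPcieLinkWidth(gpu)}")
finally:
    nvmlShutdown()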
 
I'm running dual Optane 900Ps in my two empty slots. This drops my GPU (4070 Super) to an x8 link. I'm still over 250 FPS in COD, which is all I need. I say do some testing with different games and software and see if it affects things too much. My thoughts are it won't.
 
Maybe 1%. I have a 7900 XTX that is stuck at PCIe 4.0 x8, because the card edge has a few components knocked off (the 11th lane transmitter), so the card automatically fails the pre-boot checkout and defaults to PCIe 4.0 x8.
No difference that I can tell; all my benchmarks are within 0.5 percent of published benchmarks.
I might lose a little in max 3DMark scores, 16x vs 8x at 4K.
I guess if you are playing at 1080p or 1440p, the frame rate might be 130 vs 135 FPS; in other words, it does not matter.
When I searched for 8x vs 16x, it seemed that it did not matter. Maybe 1%.
I hate the fact that my $1,300 card is "challenged", but there are no problems that I can see. Also, PCIe 4.0 x8 is equal to PCIe 3.0 x16.
Then again, Nvidia might be different from AMD.
 
The Crosshair 8 was such a great motherboard in its day, though, and even though boards in the $300-400 range have surpassed it overall, I still feel like you have to decide what features you're OK living without.

With ASUS you are always going to pay a premium if you want the extra stuff; it comes with the territory. They do make nice boards at the Strix level and above, though.

My X670E was free, so I didn't really get to pick; I would probably have picked something else, lol, but it's been fine.

But yeah, in normal usage you aren't going to notice it. My mention of something faster than the 5090 was just that if or when you get to that point it may or may not be an issue, but you'll likely upgrade platforms before then anyway.

And a theoretical Gen 5 PCIe GPU will likely be fine even at x8.
I know that even the most powerful current graphics cards cannot saturate Gen 3 x16, and I don't think that will be possible at all for Gen 5 x16. As you said, a Gen 5 GPU on a Gen 5 motherboard will work perfectly well even at x8. Frankly, I took this into consideration when purchasing the B650E-E: I figured that if Gen 4 x8 is more than enough, Gen 5 x8 will certainly be enough.
Just out of curiosity, I wanted to ask here whether the 4070 Ti Super would see any losses.

By the way, the CH8 really was a nice motherboard. I used it happily with the 5900X for almost 4 years without any problems.

16GB of VRAM certainly helps out a lot here, because there is less RAM/VRAM swapping over the PCIe bus.

Overall the difference will be 1-2%.
I looked at some general reviews on YouTube. The performance difference is as you say.

Step 1.
Run Heaven or TimeSpy with the drive in.
Step 2.
Remove the drive and run Heaven or TimeSpy
Step 3.
Compare the scores

something, something, profit?
I was going to do what you said, but since the case is fully assembled and in place, I honestly don't want to bother pulling the M.2 drives. I did look at some reviews on YouTube, though, and the performance difference is around 1-2%, which is a very insignificant difference.

I'm running dual Optane 900Ps in my two empty slots. This drops my GPU (4070 Super) to an x8 link. I'm still over 250 FPS in COD, which is all I need. I say do some testing with different games and software and see if it affects things too much. My thoughts are it won't.
A 1-2% loss is really meaningless when you're getting FPS that high. I understand that Gen 4 x8 / Gen 3 x16 is more than enough for these cards. I looked at both the review shared here and the ones on YouTube and saw the difference; it's a very negligible performance difference.

Maybe 1%. I have a 7900 XTX that is stuck at PCIe 4.0 x8, because the card edge has a few components knocked off (the 11th lane transmitter), so the card automatically fails the pre-boot checkout and defaults to PCIe 4.0 x8.
No difference that I can tell; all my benchmarks are within 0.5 percent of published benchmarks.
I might lose a little in max 3DMark scores, 16x vs 8x at 4K.
I guess if you are playing at 1080p or 1440p, the frame rate might be 130 vs 135 FPS; in other words, it does not matter.
When I searched for 8x vs 16x, it seemed that it did not matter. Maybe 1%.
I hate the fact that my $1,300 card is "challenged", but there are no problems that I can see. Also, PCIe 4.0 x8 is equal to PCIe 3.0 x16.
Then again, Nvidia might be different from AMD.
I also looked at Gen 4 x8 and Gen 3 x16 tests for the RTX 4090 online. The difference is very small, an average loss of around 2%, and that already seems insignificant for a card that gets such high FPS.
So I came to the conclusion that it makes no real difference whether the card runs at Gen 4 x8 or x16.
Either way, it still gives me pretty high FPS.
 