AMD Radeon Fury X PCI-Express Scaling

Discussion in 'Reviews' started by W1zzard, Nov 3, 2015.

  1. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    16,831 (3.45/day)
    Thanks Received:
    17,662
    In this article, we investigate how performance of AMD's Radeon Fury X is affected when running on constrained PCI-Express bus widths such as x8 or x4. We also test all PCIe speed settings, 1.1, 2.0, and 3.0. One additional test checks how much performance is lost when using the chipset's PCIe x4 slot.

    To read this article go to: https://www.techpowerup.com/reviews/AMD/R9_Fury_X_PCI-Express_Scaling/
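The bus modes tested in the article map to very different theoretical ceilings. As a rough reference (my own numbers, not from the review), the one-direction bandwidth per generation and lane count can be sketched like this, accounting for 8b/10b encoding on 1.1/2.0 and 128b/130b on 3.0:

```python
# Theoretical one-direction PCIe bandwidth for the configurations in the article.
# Per-lane rates include line-encoding overhead: 8b/10b for 1.1/2.0, 128b/130b for 3.0.
PER_LANE_MBPS = {
    "1.1": 2500 * 8 / 10 / 8,    # 2.5 GT/s -> 250 MB/s per lane
    "2.0": 5000 * 8 / 10 / 8,    # 5.0 GT/s -> 500 MB/s per lane
    "3.0": 8000 * 128 / 130 / 8, # 8.0 GT/s -> ~985 MB/s per lane
}

def bandwidth_gbps(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a generation and lane count."""
    return PER_LANE_MBPS[gen] * lanes / 1000

for gen in ("1.1", "2.0", "3.0"):
    for lanes in (4, 8, 16):
        print(f"PCIe {gen} x{lanes}: {bandwidth_gbps(gen, lanes):.2f} GB/s")
```

So the slowest mode tested (1.1 x4, about 1 GB/s) has roughly 1/16th the bandwidth of 3.0 x16, which is what makes the small real-world deltas in the review notable.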
     
    Last edited by a moderator: Nov 3, 2015
    10 Year Member at TPU
  2. Ferrum Master

    Ferrum Master

    Joined:
    Nov 18, 2010
    Messages:
    3,707 (1.48/day)
    Thanks Received:
    2,166
    Location:
    Rīga, Latvia
Just for fun: does the overall system power consumption change when using PCIe 2.0 vs. 3.0?
     
    Crunching for Team TPU
  3. Assimilator

    Assimilator

    Joined:
    Feb 18, 2005
    Messages:
    1,189 (0.26/day)
    Thanks Received:
    546
    Location:
    South Africa
    I was hoping for this test run with CrossFire Fury X, to see how much bandwidth XDMA consumes and how badly CF scaling is affected by lower speeds.
     
    10 Year Member at TPU
  4. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    16,831 (3.45/day)
    Thanks Received:
    17,662
I'll do 970 SLI next, after that maybe Fury X CF, if I can find a second card...
     
    manofthem and Assimilator say thanks.
    10 Year Member at TPU
  5. FW1374

    Joined:
    Mar 10, 2006
    Messages:
    11 (0.00/day)
    Thanks Received:
    0
IIRC, Wolfenstein had interesting results last time. Maybe add that game to the 970 test?
     
    10 Year Member at TPU
  6. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    3,674 (2.95/day)
    Thanks Received:
    2,106
    Location:
    Texas
This is a great read; it's good to get answers to questions many people ask when it comes to PCIe scaling.

It also shows that as long as you have Sandy Bridge or newer (heck, even chips like the i7-920), you can still run games maxed out without worrying about PCIe performance scaling.
     
  7. Basard

    Basard

    Joined:
    Dec 15, 2006
    Messages:
    1,076 (0.27/day)
    Thanks Received:
    282
    Location:
    Oshkosh, WI
Cool... so I'm all good to go for another few years with my 770 chipset then. :laugh: (edit) Seeing GhostRyder's post makes me cry now.

I like the new look of the charts, by the way. :lovetpu:
     
    10 Year Member at TPU Crunching for Team TPU
  8. Joss

    Joss

    Joined:
    Sep 10, 2014
    Messages:
    285 (0.26/day)
    Thanks Received:
    114
Thank you.
Tests like this define what a tech site is, and this is a tech site.
     
  9. heydan83

    heydan83

    Joined:
    Apr 24, 2014
    Messages:
    52 (0.04/day)
    Thanks Received:
    29
    Thanks for this great article!
     
  10. Hayder_Master

    Hayder_Master

    Joined:
    Apr 21, 2008
    Messages:
    5,229 (1.52/day)
    Thanks Received:
    668
    Location:
    IRAQ-Baghdad
Great review, W1zzard. Any tests planned for the GTX 980 Ti? Or can you tell me what you'd guess? I think it would scale about the same as this.
     
  11. night.fox

    night.fox

    Joined:
    Dec 23, 2012
    Messages:
    1,389 (0.80/day)
    Thanks Received:
    939
    Location:
    south korea
@W1zzard can you do PCIe scaling on a board that has a PLX chip? I wonder about the impact of PLX latency on CrossFire/SLI on a PLX vs. a non-PLX board.

    Anyway, thanks for the test and great review.
     
  12. geon2k2

    Joined:
    Apr 18, 2015
    Messages:
    183 (0.21/day)
    Thanks Received:
    51
Please, please, please... I'm very interested in seeing what happens now that the CrossFire bridge is gone. There are many users out there who have x16+x4 slots (both on 2.0 and on 3.0).
     
  13. Ubersonic

    Joined:
    Nov 5, 2014
    Messages:
    410 (0.39/day)
    Thanks Received:
    152
Doesn't Nvidia restrict SLI to x8 and x16 slots?
     
  14. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    16,831 (3.45/day)
    Thanks Received:
    17,662
I think so, but with PCIe speeds 1.1, 2.0, and 3.0 there should be sufficient ways to test this.
     
    10 Year Member at TPU
  15. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    10,248 (4.96/day)
    Thanks Received:
    5,321
    Location:
    Concord, NH
I think PCI-E scaling is, in reality, more of a concern in CFX/SLI setups. 95% of the time, someone with a single graphics card is going to have it in a 16-lane slot; the exception is some very low-end boards that only have 4-lane slots, and those would already be held back by the platform and CPU.

What I want to see is the difference for dual GPU at 16/16, 8/8, and 8/4, which seem to be the most common configurations for two GPUs.
     
    RealNeil says thanks.
  16. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    46,006 (9.71/day)
    Thanks Received:
    13,642
    Location:
    Australalalalalaia.
One thing that stands out: there's a large amount of detail on Intel platforms supporting PCI-E 3.0 on the first page, but nothing about AMD supporting it. Seems odd to have such detail on Intel, yet forget the competition.

edit: poor wording. I'm aware that AMD has piss-all support for it on the chipset side, but a comment on that seems like it belongs in this type of article.
     
    Last edited: Nov 4, 2015
    10 Year Member at TPU
  17. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    10,248 (4.96/day)
    Thanks Received:
    5,321
    Location:
    Concord, NH
AM3/+ doesn't really support PCI-E 3.0 in any meaningful way, which only leaves the APUs. On the other hand, my 3820 is currently running my 390 over PCI-E 3.0, which gives you an idea of how long Intel has had it available.
     
  18. Ferrum Master

    Ferrum Master

    Joined:
    Nov 18, 2010
    Messages:
    3,707 (1.48/day)
    Thanks Received:
    2,166
    Location:
    Rīga, Latvia
Yeah, but the funny thing is that AMD was right to hold off, since PCIe 3.0 really isn't needed... especially for their budget and mid-range platform offerings. And these tests prove it.
     
    Crunching for Team TPU
  19. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    10,248 (4.96/day)
    Thanks Received:
    5,321
    Location:
    Concord, NH
AMD didn't hold off; AM3/+ is just a dead platform. A lot of newer APUs have PCI-E 3.0, in particular the 7xxx series.
     
  20. FourtyTwo

    Joined:
    Sep 16, 2014
    Messages:
    20 (0.02/day)
    Thanks Received:
    4
    Excellent reference article.
     
  21. Ferrum Master

    Ferrum Master

    Joined:
    Nov 18, 2010
    Messages:
    3,707 (1.48/day)
    Thanks Received:
    2,166
    Location:
    Rīga, Latvia
Yeah, but it took them a few years to have it at all.
     
    Crunching for Team TPU
  22. Legacy-ZA

    Legacy-ZA

    Joined:
    Dec 14, 2011
    Messages:
    176 (0.08/day)
    Thanks Received:
    70
    Location:
    South-Africa
This was a very interesting article, thank you for making it. I have to admit I was expecting much worse performance in the older PCIe modes, but honestly, it really isn't bad at all.
     
  23. TheDeeGee

    TheDeeGee

    Joined:
    Aug 2, 2012
    Messages:
    350 (0.19/day)
    Thanks Received:
    94
    Location:
    Netherlands
Still no worries between x8 and x16 3.0, it seems.

In my current setup I'm limited to x8 3.0 because of my dedicated sound card. I could put it in the x1 slot, but then it's crammed right up against my video card, and I'm not really fond of that.
     
  24. BiggieShady

    BiggieShady

    Joined:
    Feb 8, 2012
    Messages:
    2,616 (1.27/day)
    Thanks Received:
    1,885
    Location:
    Zagreb, Croatia
Most PCIe traffic in games happens during the loading screen, and of course there's no point measuring the performance of a progress bar.
So, during actual gameplay, PCIe traffic spikes when streaming high-res texture mip levels and/or geometry LODs.
Unreal Engine uses texture streaming extensively, and streaming is multi-threaded... one would think it shouldn't be a problem to saturate the PCIe bus this way, even on an old PCIe 1.1 dual-CPU mobo with Xeons.
The thing is, Unreal Engine (and probably others too) has a system in place to monitor the bandwidth used for texture streaming, and a mechanism to make optimal use of the available PCIe bandwidth in order to avoid stuttering.
In Unreal Engine games you can type "stat StreamingDetails" in the console and see the bandwidth that streaming uses.
It would be interesting to see whether those values exceed any of the PCIe modes, and whether streaming adjusts to work stutter-free on all of them.
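The comparison described above can be sketched in a few lines: take streaming-bandwidth samples (such as those reported by "stat StreamingDetails") and check which bus modes they could plausibly saturate. The ceilings and sample values here are illustrative assumptions, not measurements.

```python
# Hypothetical sketch: flag PCIe modes that texture-streaming bursts could saturate.
# Ceilings are approximate theoretical one-direction bandwidths for x16 links.
PCIE_CEILING_MBPS = {
    "1.1 x16": 4000,
    "2.0 x16": 8000,
    "3.0 x16": 15750,
}

def saturated_modes(samples_mbps, headroom=0.8):
    """Return modes whose ceiling the peak streaming rate exceeds,
    allowing `headroom` (80% by default) for other bus traffic."""
    peak = max(samples_mbps)
    return [mode for mode, cap in PCIE_CEILING_MBPS.items() if peak > cap * headroom]

# Example: bursts up to 3.6 GB/s would stress a 1.1 x16 link but not 2.0 or 3.0.
print(saturated_modes([900, 1400, 3600]))  # -> ['1.1 x16']
```

If the engine throttles streaming to fit the link, the interesting number isn't the bandwidth itself but how long the slowest frames get while it catches up.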
     
    Aquinus says thanks.
  25. GC_PaNzerFIN

    GC_PaNzerFIN

    Joined:
    Oct 9, 2009
    Messages:
    674 (0.23/day)
    Thanks Received:
    658
    Location:
    Finland
How does the bandwidth affect minimum fps? I remember that in the good old days, at least, there was a more significant change in minimum fps, not so much in the average. More dips in fps right when you definitely don't want them.
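That distinction is easy to see from frame times: the average barely notices a single slow frame, while the minimum collapses. A small sketch with made-up frame times (not data from the review):

```python
# Average vs. minimum fps from per-frame render times (milliseconds).
# Minimum fps is driven by the single slowest frame, which is exactly where
# bandwidth-induced hitches would show up first.
def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    return 1000 / max(frame_times_ms)

# 99 smooth 16.7 ms frames plus one 50 ms hitch:
# the average stays near 59 fps, but the minimum drops to 20 fps.
frames = [16.7] * 99 + [50.0]
print(round(avg_fps(frames), 1), round(min_fps(frames), 1))
```

This is why frame-time plots (or 1% lows) would tell more about PCIe-induced stutter than the averages alone.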
     