
AMD Radeon Fury X PCI-Express Scaling


Introduction


It's been one year since our last PCI-Express scaling article, in which we used the GeForce GTX 980. Not much has happened since then in terms of slot technology and bandwidth. As in recent years, almost everybody is using PCI-Express x16 3.0, and while Intel's enthusiast platform (Haswell-E) offers plenty of lanes for even multiple VGA cards, it is the exception: every other platform still splits the x16 3.0 link into two x8 3.0 slots for a dual-GPU setup.

PCI-Express 3.0 saw its debut on Intel's Sandy Bridge-E processors, but was adopted into the mainstream when the Ivy Bridge architecture was released in 2012. Since then, PCI-E 3.0 has been used on Haswell and now Skylake.



While PCI-Express 1.0 pushes 250 MB/s per lane in each direction, PCI-Express 2.0 doubles that to 500 MB/s, and PCI-Express 3.0 roughly doubles it again to 1 GB/s. The resulting bandwidth of PCI-Express 3.0 x16, 16 GB/s per direction (32 GB/s in both directions combined), might seem like overkill, but that higher per-lane transfer rate could come to the rescue of 8-lane (x8) and 4-lane (x4) configurations.
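As a quick back-of-the-envelope illustration (not part of our test procedure), the one-way bandwidth of any slot configuration follows directly from the per-lane rate multiplied by the lane count. The short Python sketch below uses the rounded per-lane figures quoted above; real-world throughput is slightly lower because of protocol overhead.

# Approximate per-direction PCI-Express bandwidth from the rounded
# per-lane rates quoted above (protocol overhead ignored).
PER_LANE_GBPS = {"1.1": 0.25, "2.0": 0.50, "3.0": 1.00}  # GB/s per lane

def slot_bandwidth(gen: str, lanes: int) -> float:
    """Return the approximate one-way bandwidth of a slot in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(slot_bandwidth("3.0", 16))  # 16.0 GB/s per direction (32 GB/s both ways)
print(slot_bandwidth("3.0", 4))   # 4.0 GB/s, the same as a full x16 slot at gen 1.1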

In this review, we test the performance impact of running AMD's Radeon Fury X in electrical PCI-Express x16, x8, and x4 configurations on our new Windows 10, Intel Skylake test bench. We made a point of testing all three generations of the PCI-Express interface: 1.1, 2.0, and 3.0.



To adjust the number of lanes available to the graphics card, we used common plastic adhesive tape to cover up and disable lanes, mimicking the different slot widths.

For the graphs on the following pages, we've colored same-bandwidth configurations identically as an easy visual reference. PCIe x8 3.0, for example, offers just as much bandwidth as PCIe x16 2.0, which is why both share the same color.
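Purely as an illustration of that color scheme (and assuming the rounded 1 GB/s per-lane figure for PCIe 3.0), the tested configurations can be bucketed by equal bandwidth like this:

# Group the tested slot configurations by equal (rounded) bandwidth,
# mirroring the color coding used in the charts on the following pages.
from collections import defaultdict

PER_LANE_GBPS = {"1.1": 0.25, "2.0": 0.50, "3.0": 1.00}  # GB/s per lane

groups = defaultdict(list)
for gen in ("1.1", "2.0", "3.0"):
    for lanes in (4, 8, 16):
        groups[PER_LANE_GBPS[gen] * lanes].append(f"x{lanes} {gen}")

for gbps, configs in sorted(groups.items()):
    print(f"{gbps:4.1f} GB/s: {', '.join(configs)}")
# 8.0 GB/s, for example, covers both x8 3.0 and x16 2.0.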

We also separately tested a PCI-Express x4 3.0 link routed through the chipset, which is what usually drives the third PCIe slot on most motherboards.

Test System

Test System - VGA Rev. 40
Processor: Intel Core i7-6700K @ 4.5 GHz (Skylake, 8192 KB Cache)
Motherboard: ASUS Maximus VIII Hero (Intel Z170)
Memory: G.SKILL Trident-Z 16 GB DDR4 @ 3000 MHz 15-16-16-35
Storage: WD Caviar Blue WD10EZEX 1 TB
Power Supply: Antec HCP-1200 1200 W
Cooler: Cryorig R1 Universal, 2x 140 mm fans
Software: Windows 10 64-bit
Drivers: NVIDIA 358.50 WHQL; AMD Catalyst 15.9.1 Beta
Display: Acer CB240HYKbmjdpr 24" 3840x2160
Benchmark scores in other reviews are only comparable when this exact same configuration is used.
  • All video card results are obtained on this exact system with exactly the same configuration.
  • All games are set to their highest quality setting unless indicated otherwise.
  • AA and AF are applied via in-game settings, not via the driver's control panel.
Each game is tested at these screen resolutions:
  • 1600x900: Common resolution for most smaller flatscreens and laptops today (17" - 19").
  • 1920x1080: Most common widescreen resolution for larger displays (22" - 26").
  • 2560x1440: Highest possible 16:9 resolution for commonly available displays (27"-32").
  • 3840x2160: 4K Ultra HD resolution, available on the latest high-end monitors.

Assassin's Creed: Unity


The latest entry in Ubisoft's smash-hit stealth sandbox franchise, Assassin's Creed Unity introduces cooperative multiplayer gameplay in which up to four players can cruise through missions and explore an open world together, opening up innovative new ways to approach each objective. Spanning several time periods, such as the French Revolution and the Nazi occupation of France, the game serves up some stunning visual detail and is one of the most resource-heavy titles on the market today. Based on Ubisoft's latest AnvilNext engine, it takes advantage of DirectX 11. The release has been plagued by loads of bugs and performance issues, making this launch one of the most problematic in recent memory.
