NVIDIA recently launched its GeForce RTX 20-series graphics cards based on its new "Turing" architecture with two high-end parts: the GeForce RTX 2080 Ti and RTX 2080. Despite a long list of innovations, NVIDIA chose to give these chips a PCI-Express 3.0 x16 bus interface, even though the PCI-Express gen 4.0 specification was published over a year ago and could reach desktop platforms in the next few months. Rival AMD is rumored to be implementing PCI-Express gen 4.0 in its next-generation GPU.
The PCI-Express bus interface has endured a decade and a half of market dominance thanks to its scalable design, backwards compatibility, and near-doubling in data bandwidth with each generation. PCI-Express generation 3, introduced in 2011, has seen NVIDIA and AMD launch four generations of GPUs on it, none of which managed to saturate it at full x16 bus width. That, coupled with the decline of multi-GPU setups beyond two graphics cards, has blunted the connectivity edge the high-end desktop (HEDT) platform once held over the mainstream desktop.
We have a tradition of testing PCI-Express bus utilization and scaling each time a new graphics architecture from either company launches. We do so by measuring a new-generation graphics card's performance across various PCI-Express configurations, narrowing its bus width (number of lanes) and limiting its bandwidth to that of older generations, which gives us data from which to draw several inferences. It tells us whether the new GeForce RTX 2080 Ti can be bottlenecked in multi-GPU setups on mainstream desktop platforms and whether it's time to replace an old motherboard that uses an older generation of PCI-Express. It also helps answer questions like "Will my graphics card run slower at PCIe x8?" and "Do I need to run SLI at x16, which means buying the more expensive X299 platform, for the best performance?"
In this review, we are taking the fastest graphics card from NVIDIA, the GeForce RTX 2080 Ti Founders Edition, and testing it across PCI-Express 3.0 x16 (the most common configuration for single-GPU builds), PCI-Express 3.0 x8 (bandwidth comparable to PCI-Express 2.0 x16), PCI-Express 3.0 x4 (comparable to PCI-Express 2.0 x8), and, for purely academic reasons, PCI-Express 2.0 x4 (what would happen if you installed your card in the bottom-most slot of your motherboard). The table below gives you an idea of the theoretical maximum bandwidths of the common PCI-Express configurations:
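Those theoretical maxima follow directly from each generation's per-lane transfer rate and its encoding overhead (gens 1 and 2 use 8b/10b encoding, gens 3 and 4 use 128b/130b). As a quick sanity check, the per-direction figures for the configurations tested here can be computed with a short sketch (the function and names below are our own illustration, not part of any benchmark tool):

```python
# Sketch: theoretical one-way PCIe bandwidth, per the published specs.
# Gens 1/2 use 8b/10b encoding (20% overhead); gens 3/4 use 128b/130b.

GT_PER_S = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}        # transfer rate per lane
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130}

def bandwidth_gbps(gen: int, lanes: int) -> float:
    """Theoretical per-direction bandwidth in GB/s for a PCIe link."""
    # GT/s x encoding efficiency x lanes, divided by 8 bits per byte
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8

# The configurations tested in this review:
for gen, lanes in [(3, 16), (3, 8), (3, 4), (2, 4)]:
    print(f"PCIe {gen}.0 x{lanes}: {bandwidth_gbps(gen, lanes):.2f} GB/s")
```

This makes the comparisons in the text concrete: PCIe 3.0 x8 (~7.9 GB/s) lands close to PCIe 2.0 x16 (8 GB/s), and PCIe 3.0 x4 (~3.9 GB/s) close to PCIe 2.0 x8 (4 GB/s).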
For all our PCI-Express bandwidth testing, we limit the bus width by physically masking the slot's lane contacts with insulating tape; the modular design of PCI-Express allows for this. The motherboard BIOS also lets us limit the PCI-Express feature set to that of older generations. We then put the card, in each of its PCI-Express configurations, through our entire battery of graphics card benchmarks, all of which are real-world game tests.
Our exhaustive coverage of the NVIDIA GeForce RTX 20-series "Turing" debut also includes the following reviews: NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB | NVIDIA GeForce RTX 2080 Founders Edition 8 GB | ASUS GeForce RTX 2080 Ti STRIX OC 11 GB | ASUS GeForce RTX 2080 STRIX OC 8 GB | Palit GeForce RTX 2080 Gaming Pro OC 8 GB | MSI GeForce RTX 2080 Gaming X Trio 8 GB | MSI GeForce RTX 2080 Ti Gaming X Trio 11 GB | MSI GeForce RTX 2080 Ti Duke 11 GB | NVIDIA RTX and Turing Architecture Deep-dive
Benchmark scores in other reviews are only comparable when this exact same configuration is used.
|Test System|VGA Rev. 2018.2|
|---|---|
|Processor:|Intel Core i7-8700K @ 4.8 GHz (Coffee Lake, 12 MB Cache)|
|Motherboard:|ASUS Maximus X Hero|
|Memory:|G.SKILL 16 GB Trident-Z DDR4 @ 3867 MHz 18-19-19-39|
|Storage:|2x Patriot Ignite 960 GB SSD|
|Power Supply:|Seasonic Prime Ultra Titanium 850 W|
|Cooler:|Cryorig R1 Universal 2x 140 mm fan|
|Software:|Windows 10 64-bit April 2018 Update|
|Drivers:|GeForce 411.51 Press Driver|
|Display:|Acer CB240HYKbmjdpr 24" 3840x2160|
- All games and cards were tested with the drivers listed above; no performance results were recycled between test systems. Only this exact system, in exactly this configuration, was used.
- All games are set to their highest quality setting unless indicated otherwise.
- AA and AF are applied via in-game settings, not via the driver's control panel.
- 1920x1080: Most common monitor (22" - 26").
- 2560x1440: Highest possible 16:9 resolution for commonly available displays (27"-32").
- 3840x2160: 4K Ultra HD resolution, available on the latest high-end monitors.
Assassin's Creed Origins