
Modders Pull Off 16GB GeForce RTX 2080 Upgrade, Modded Card Posts 8% Performance Boost

btarunr

Editor & Senior Moderator
Brazilian tech enthusiast Paulo Gomes, in association with Jefferson Silva and Ygor Mota, successfully modded an EVGA GeForce RTX 2080 "Turing" graphics card to 16 GB. This was done by replacing each of its 8 Gbit GDDR6 memory chips with ones of double the density, at 16 Gbit. Over the GPU's 256-bit wide memory bus, eight of these chips add up to 16 GB. The memory speed was unchanged at the reference 14 Gbps, as were the GPU clocks.
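For reference, the capacity arithmetic works out as in the quick Python sketch below (illustrative only; it assumes the standard 32-bit interface per GDDR6 chip, with the bus width and data rate taken from the card's specs):

```python
# Capacity and bandwidth arithmetic for the RTX 2080 memory mod (illustrative).
BUS_WIDTH_BITS = 256    # TU104 memory bus
CHIP_WIDTH_BITS = 32    # standard GDDR6 chip interface (assumption)
DATA_RATE_GBPS = 14     # per pin, unchanged by the mod

chips = BUS_WIDTH_BITS // CHIP_WIDTH_BITS        # 8 chips fill the bus
for density_gbit in (8, 16):                     # stock vs. modded chip density
    capacity_gb = chips * density_gbit / 8       # Gbit -> GB
    print(f"{chips} x {density_gbit} Gbit chips: {capacity_gb:.0f} GB total")

bandwidth_gbs = DATA_RATE_GBPS * BUS_WIDTH_BITS / 8
print(f"Bandwidth: {bandwidth_gbs:.0f} GB/s (unchanged)")  # 448 GB/s either way
```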

The process of modding involves de-soldering each of the eight 8 Gbit chips, clearing the memory pads of any solder residue that could short pins, using a GDDR6 stencil to place replacement solder balls, and then soldering the new 16 Gbit chips onto the pads under heat. Besides replacing the memory chips, a series of SMD jumpers needs to be adjusted near the BIOS ROM chip, which lets the GPU correctly recognize the 16 GB memory size. The TU104 silicon supports higher-density memory by default, as NVIDIA uses this chip on some of its professional graphics cards with 16 GB of memory, such as the Quadro RTX 5000.



That's it; nothing needs to be done on the software side, and TechPowerUp GPU-Z should show the detected memory size. A "Resident Evil 4" benchmark run with a performance overlay shows the game utilizing almost 9.8 GB of video memory, compared to 7.7 GB on the original card, and posting a 7.82% performance increase, from an average of 64 FPS to 69 FPS. This is greater than the kind of performance delta you see between the 8 GB and 16 GB variants of the RTX 4060 Ti. Besides the gaming performance increase, the 16 GB of memory should significantly improve the generative AI performance of the RTX 2080.
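The quoted uplift checks out against the rounded averages (a quick sanity check; the article's 7.82% presumably comes from unrounded FPS figures):

```python
# Performance delta from the article's averages (illustrative arithmetic).
fps_stock, fps_modded = 64, 69
uplift_pct = (fps_modded - fps_stock) / fps_stock * 100
print(f"{uplift_pct:.2f}% uplift")  # 7.81% from the rounded numbers
```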

View at TechPowerUp Main Site | Source
 
The question is, where did they get the schematic from, as you need to know which jumpers to move.

This is a successful mod, unlike the last one posted in the news on TechPowerUp, which ended up with the card artifacting after the mod. Well done to those involved.
 
I also thought that the drivers would not work or would not recognise the extra memory, but from what I gather it's linked to the Quadro cards on the same silicon allowing the higher memory amount to be accessed?
 
I also thought that the drivers would not work or would not recognise the extra memory, but from what I gather it's linked to the Quadro cards on the same silicon allowing the higher memory amount to be accessed?

Interesting you should say that, because I would have thought a change of firmware would also be required.
 
That is a cool mod... and it shows the GPU architecture, clock speed, etc. can utilize the extra RAM for a better gameplay experience in today's games.
 
I have my doubts about the "performance uplift" being the result of the memory mod, at least not solely:
First off, the GPU-Z screenshots were taken on different Windows installs (Win 11 vs. Win 10).
Secondly, in the gaming screenshots the 16 GB version is 11 °C (!!!) cooler.
Finally, and imo most importantly, the in-game scene is different, with the modded version just staring straight at a wall while the other screenshot is in motion.
 
That is a cool mod... and it shows the GPU architecture, clock speed, etc. can utilize the extra RAM for a better gameplay experience in today's games.
I'd have thought so too... maybe modded drivers?
I have my doubts about the "performance uplift" being the result of the memory mod, at least not solely:
First off, the GPU-Z screenshots were taken on different Windows installs (Win 11 vs. Win 10).
Secondly, in the gaming screenshots the 16 GB version is 11 °C (!!!) cooler.
Finally, and imo most importantly, the in-game scene is different, with the modded version just staring straight at a wall while the other screenshot is in motion.
Very good pickups. I forget the increments, but a few architectures in a row now have scaled speed directly with temperature: two identical cards, one at 45 °C load and one at 75 °C load, will be separated by a good 30-90 MHz. It's something like 15 MHz every 8 °C with Ampere? I'm not certain of the specific amount, but assuming Turing works the same, the temperature could easily account for some of that performance delta, along with the other variables you mentioned and perhaps even more.
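A rough sketch of what that temperature scaling would mean here, assuming the ~15 MHz per 8 °C bin mentioned above holds for Turing (the bin size is an assumption from the post, not a measured figure):

```python
# Rough GPU Boost clock estimate from a temperature gap, assuming the
# ~15 MHz per 8 degC bin quoted above (bin size varies by architecture).
def boost_clock_delta_mhz(temp_delta_c: float,
                          mhz_per_bin: float = 15.0,
                          bin_c: float = 8.0) -> float:
    """Estimate the clock advantage of the cooler card."""
    return temp_delta_c / bin_c * mhz_per_bin

# The modded card ran 11 degC cooler in the screenshots:
print(f"{boost_clock_delta_mhz(11):.0f} MHz")  # ~21 MHz clock edge
```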
 
Interesting mod.
 
Hmm, I believe the driver isn't utilizing the additional VRAM; it likely remains in conservative mode, constantly attempting to restrict itself to the original 8 GB of VRAM.

I'm curious whether the GPU might become unstable if faster memory chips (18 Gbps) were installed.
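One way to test that theory (a sketch, assuming a working NVIDIA driver with nvidia-smi on the PATH) is to watch the driver-reported total and used VRAM while something memory-hungry runs; if the driver really capped itself near 8 GB, memory.used would plateau there:

```python
# Query driver-reported VRAM via nvidia-smi (assumes the tool is installed).
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.total,memory.used",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "16384 MiB, 9830 MiB" (illustrative values)
```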
 
I have my doubts about the "performance uplift" being the result of the memory mod, at least not solely:
First off, the GPU-Z screenshots were taken on different Windows installs (Win 11 vs. Win 10).
Secondly, in the gaming screenshots the 16 GB version is 11 °C (!!!) cooler.
Finally, and imo most importantly, the in-game scene is different, with the modded version just staring straight at a wall while the other screenshot is in motion.
Watch the video

Hmm, I believe the driver isn't utilizing the additional VRAM; it likely remains in conservative mode, constantly attempting to restrict itself to the original 8 GB of VRAM.

I'm curious whether the GPU might become unstable if faster memory chips (18 Gbps) were installed.
Watch the video
 
The question is, where did they get the schematic from, as you need to know which jumpers to move.

This is a successful mod, unlike the last one posted in the news on TechPowerUp, which ended up with the card artifacting after the mod. Well done to those involved.
The catch is, if one is an actual engineer, it's not that hard to figure out.
 
Just wondering... why? Wouldn't it be wiser to mod newer cards, or are those so locked down that it's practically impossible?

The question is, where did they get the schematic from, as you need to know which jumpers to move.
My wild guess is pure trial and error.
 
I wonder if I could make my 12GB card into a 24GB then perhaps install a 3090 BIOS.
 
mad soldering skillz right there
 
I wonder if I could make my 12GB card into a 24GB then perhaps install a 3090 BIOS.
Even though both use GA102 silicon, they aren't exactly the same. The 3080 has more shaders fused off than the 3090, so you couldn't just add more memory chips and then use the 3090 BIOS, since it isn't the exact same chip.
 
What was the first Nvidia graphics card with more than 12 GB (friggin' 11 GB) of VRAM? Nvidia fanboys love to complain about AMD, though at least AMD doesn't go out of their way to intentionally screw customers on VRAM in a bid to force people to upgrade... friggin' same generation now. I max out 16 GB often enough; the memory interfaces on those 4070s must be where all the mileage is on those cards. Nvidia could release cards with enough VRAM, though because people keep buying garbage, they'll happily keep selling it. :kookoo:
 
nGreedia is watching this situation. Wanna bet they'll start artificially limiting their drivers to not recognise cards that have been modded, if this catches on!
 