
NVIDIA Responds to GTX 970 Memory Allocation 'Bug' Controversy

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,277 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
The GeForce GTX 970 memory allocation bug, discovered late last Friday, wrecked some NVIDIA engineers' weekends as they composed a response to what they say is a non-issue. The bug was found in the way the GeForce GTX 970 allocates its 4 GB of video memory, giving some power users the impression that the GPU isn't addressing the last 500-700 MB of it. NVIDIA, in its response, explained that the GPU is fully capable of addressing all 4 GB, but does so in an unusual way. Without further ado, the statement.

The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.

Continued
We understand there have been some questions about how the GTX 970 will perform when it accesses the 0.5GB memory segment. The best way to test that is to look at game performance. Compare a GTX 980 to a 970 on a game that uses less than 3.5GB. Then turn up the settings so the game needs more than 3.5GB and compare 980 and 970 performance again.

Here's an example of some performance data:

GTX 970 vs. GTX 980 Memory-Intensive Performance Data

|                                                          | GeForce GTX 980 | GeForce GTX 970 |
|----------------------------------------------------------|-----------------|-----------------|
| Shadow of Mordor                                         |                 |                 |
| <3.5GB setting = 2688x1512 Very High                     | 72 fps          | 60 fps          |
| >3.5GB setting = 3456x1944                               | 55 fps (-24%)   | 45 fps (-25%)   |
| Battlefield 4                                            |                 |                 |
| <3.5GB setting = 3840x2160 2xMSAA                        | 36 fps          | 30 fps          |
| >3.5GB setting = 3840x2160 135% res                      | 19 fps (-47%)   | 15 fps (-50%)   |
| Call of Duty: Advanced Warfare                           |                 |                 |
| <3.5GB setting = 3840x2160 FSMAA T2x, Supersampling off  | 82 fps          | 71 fps          |
| >3.5GB setting = 3840x2160 FSMAA T2x, Supersampling on   | 48 fps (-41%)   | 40 fps (-44%)   |

On Shadow of Mordor, performance drops about 24% on GTX 980 and 25% on GTX 970, a 1% difference. On Battlefield 4, the drop is 47% on GTX 980 and 50% on GTX 970, a 3% difference. On CoD: AW, the drop is 41% on GTX 980 and 44% on GTX 970, a 3% difference. As you can see, there is very little change in the performance of the GTX 970 relative to the GTX 980 on these games when it is using the 0.5GB segment.
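
For readers who want to poke at the two segments outside of games, below is a rough sketch of the kind of CUDA-runtime micro-benchmark power users have been circulating to surface them. It is illustrative only, not NVIDIA's test: the chunk size is an arbitrary choice, and readings near the very top of VRAM can be confounded by memory the OS and display already reserve.

```cpp
// Hypothetical probe: allocate VRAM in slices and time a device-to-device
// copy inside each slice. On a segmented card, slices that land in the
// slow segment should report noticeably lower bandwidth.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

int main() {
    const size_t chunk = 128ull << 20;   // 128 MB slices (arbitrary choice)
    size_t freeB = 0, totalB = 0;
    cudaMemGetInfo(&freeB, &totalB);
    printf("free %zu MB of %zu MB\n", freeB >> 20, totalB >> 20);

    std::vector<char*> slices;
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Keep grabbing slices until allocation fails.
    for (;;) {
        char* p = nullptr;
        if (cudaMalloc(&p, chunk) != cudaSuccess) break;
        slices.push_back(p);

        cudaEventRecord(start);
        cudaMemcpy(p + chunk / 2, p, chunk / 2, cudaMemcpyDeviceToDevice);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.f;
        cudaEventElapsedTime(&ms, start, stop);
        double gbps = (chunk / 2) / (ms * 1e-3) / 1e9;
        printf("slice %2zu (~%4zu MB allocated): %.1f GB/s\n",
               slices.size(), (slices.size() * chunk) >> 20, gbps);
    }

    for (char* p : slices) cudaFree(p);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}
```

The absolute numbers don't matter much; what matters is whether the last few slices fall off a cliff relative to the earlier ones.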

Joined
Feb 18, 2011
Messages
1,259 (0.26/day)
They really should have gone the other way around with this whole story. The 970 is still a hell of a card. If they had said it only has 3.5GB, it would all be fine now.
They could have just "leaked" how to enable the last 0.5GB, and enthusiasts would all try it and argue and debate on every tech forum about whether it's worth it or not because of the performance hit (which is about 2-3 fps?).

There would be no drama now.
 
Joined
Dec 9, 2007
Messages
746 (0.13/day)
I don't see what the big deal is. :confused: The 970 still performs within spitting distance of the 980, for a lot less $. All of those who have their panties twisted over some non-standard memory configuration need to go back and read the 970 reviews again. This discovery does not suddenly make the 970 slower than it was a month ago. Besides, if you wanted higher performance, you'd have to open up your wallet; there's no way around that, especially with GM200 on the horizon.
 
Joined
May 21, 2009
Messages
224 (0.04/day)
On its own forums, NVIDIA has stated that more information will be forthcoming and that this explanation was to let everyone understand 'how' the memory allocation works.
 

Pasha

New Member
Joined
Apr 6, 2013
Messages
12 (0.00/day)
Location
Slovenia
They really should have gone the other way around with this whole story. The 970 is still a hell of a card. If they had said it only has 3.5GB, it would all be fine now.
They could have just "leaked" how to enable the last 0.5GB, and enthusiasts would all try it and argue and debate on every tech forum about whether it's worth it or not because of the performance hit (which is about 2-3 fps?).

There would be no drama now.

FPS is OK, but you get stutter when using all the memory.
 
Joined
Feb 18, 2011
Messages
1,259 (0.26/day)
FPS is OK, but you get stutter when using all the memory.
Yes, but it really depends on the actual scenario and the engine in question. If it's allocated as texture memory, for example, most modern engines would just draw the frame without waiting for the texture to be streamed. But yes, you would get stutter if it's a frame buffer, a surface, compute memory, etc. It can be bad, I agree.
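
To make that distinction concrete, here's a minimal sketch of the mip-fallback idea; the structure and helper names are made up for illustration and not taken from any particular engine:

```cpp
#include <cstdio>

// Hypothetical texture record; not from any specific engine.
struct Texture {
    int residentMip;   // best mip currently in VRAM (0 = full resolution)
    int requestedMip;  // mip the renderer wants this frame
};

// Stub standing in for an asynchronous upload request.
void requestUploadAsync(Texture&, int /*mip*/) { /* queue the copy, return immediately */ }

// A streaming engine avoids stutter by never waiting on the upload: if the
// wanted mip isn't resident, it draws the blurrier one it already has and
// lets the async copy catch up on a later frame (pop-in instead of a stall).
int selectMipForDraw(Texture& t) {
    if (t.requestedMip < t.residentMip) {
        requestUploadAsync(t, t.requestedMip);
        return t.residentMip;
    }
    return t.requestedMip;
}

int main() {
    Texture t{ 3, 0 };  // only mip 3 resident, renderer wants full-res mip 0
    std::printf("draw with mip %d\n", selectMipForDraw(t));  // prints 3: no stall
}
```

Render targets, surfaces and compute buffers can't be downgraded like that mid-frame, which is where a slow segment turns into visible stutter.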
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
Cranking up settings is imo not the way this should be tested, because you change the conditions. Make the textures occupy more memory at the same resolution instead...
 
Joined
Aug 22, 2014
Messages
39 (0.01/day)
System Name Bird's Monolith
Processor Intel i7-4770k 4.6 GHz (liquid metal)
Motherboard Asrock Z87 Extreme3
Cooling Noctua NH-D14, Noctua 140mm Case Fans
Memory 16 GB G-Skill Trident-X DDR3 2400 CAS 9
Video Card(s) EVGA 1080ti SC2 Hybrid
Storage 2 TB Mushkin Triactor 3D (RAID 0)
Display(s) Dell S2716DG / Samsung Q80R QLED TV
Case Fractal Design Define R4
Audio Device(s) Audio Engine D1 DAC, A5+ Speakers, SteelSeries Arctis 7
Power Supply Seasonic Platinum 660 W
Mouse SteelSeries Rival
Keyboard SteelSeries Apex
Software Windows 10 Pro x64
Nvidia is a competent company. They're likely prioritizing latency-sensitive allocations to the 3.5GB segment and keeping lower-priority memory in the 0.5GB segment. I mean, their logic is solid; the only time you'll really notice any degradation is when a single frame requires more than the entire 3.5GB segment, because the software has to work out some extra overhead with virtual addressing.

The likelihood of any game needing more than 3.5GB for a single frame is pretty low for where most people game. Nvidia had to make some tradeoffs in their architecture to achieve their massive efficiency gains over Kepler, and tying the processors' addressing capabilities to the SMs is one of those tradeoffs. If anything, I think it was a smart tradeoff; it's taken this long for people to even notice there was a compromise made at all.

I think at the end of the day people are going to say this is still one hell of a card for 95% of gamers out there, and the other 5% who want to squeeze every single performance percentage out of 4K and multi-monitor setups are just gonna throw down cash on a GTX 980(s) or 290X(s).
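
Purely as an illustration of what that prioritization could look like, here's a toy two-segment placement sketch. The sizes, the priority classes, and the policy are all assumptions, since the actual driver heuristics aren't public:

```cpp
#include <cstdint>
#include <cstdio>

// Toy model: a fast 3.5GB segment and a slow 0.5GB segment (illustrative only).
constexpr uint64_t FAST_BYTES = 3584ull << 20;
constexpr uint64_t SLOW_BYTES = 512ull << 20;

struct Segments { uint64_t fastUsed = 0, slowUsed = 0; };

enum class Priority { High, Low };  // e.g. render targets vs. rarely-touched textures

// Prefer the fast segment; spill low-priority allocations to the slow one
// only once the fast segment is full.
const char* place(Segments& s, uint64_t bytes, Priority prio) {
    if (s.fastUsed + bytes <= FAST_BYTES) { s.fastUsed += bytes; return "fast 3.5GB segment"; }
    if (prio == Priority::Low && s.slowUsed + bytes <= SLOW_BYTES) {
        s.slowUsed += bytes;
        return "slow 0.5GB segment";
    }
    return "evict / fall back to system memory";
}

int main() {
    Segments s;
    printf("%s\n", place(s, 2048ull << 20, Priority::High)); // fast 3.5GB segment
    printf("%s\n", place(s, 1400ull << 20, Priority::High)); // fast 3.5GB segment
    printf("%s\n", place(s,  256ull << 20, Priority::Low));  // slow 0.5GB segment
}
```

Under a policy like this, a game staying below 3.5GB never touches the slow segment at all, which matches the behavior NVIDIA describes.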
 
Joined
May 4, 2012
Messages
412 (0.09/day)
Processor Intel i7 10700K
Motherboard Gigabyte Z490 Aorus Pro AX
Cooling Be Quiet Dark Rock Pro 4
Memory DDR4 Corsair Vengeance LPX 3200MHZ (2X16GB)
Video Card(s) Palit GTX1080ti Super JetStream 11GB
Storage Trandscend 370s 256GB / WD Caviar Black 2+1TB
Display(s) Acer XB270HU 144hz Gsync
Case Phanteks Eclipse P600S White
Audio Device(s) Sound Blaster ZXR
Power Supply Corsair RM1000X 1000W 80 Plus Gold
Cranking up settings is imo not the way this should be tested, because you change the conditions. Make the textures occupy more memory at the same resolution instead...

How exactly do we do that?
 
Joined
Aug 22, 2014
Messages
39 (0.01/day)
How exactly do we do that?

Seriously, I agree. The graphics API and driver are handling all the memory allocations, and I'm sure Nvidia is handling it in such a way that this handicapped scenario is minimized. I would imagine the only way to expose it is to brute-force the memory capacity, as in Nvidia's examples.
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
You guys are talking like it's not both the 970 and the 980. They clearly show both suffer the same performance loss when going over 3.5GB, and it's as much as half in the games shown.

It's not fair to people who have bought two or more for high-res and multi-monitor setups. What would really make everyone happy again is if they could magically make VRAM stack in SLI.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Seriously, I agree, the graphics API and driver are handling all the memory allocations and I'm sure Nvidia is handling in such a way that this handicapped scenario is minimized. I would imagine the only way to expose it is to brute force the memory capacity as in Nvidia's examples.

Their example of CoD doesn't mention quality settings. The only one that does is SoM, at less than 3.5GB. In CoD:AW at 4K VHQ you're already supposed to be at 4GB. BF4 at 4K VHQ uses 2.3GB. Details are lacking.

CoD:AW can allocate 3.2GB of memory on a 980 at 1080p VHQ, according to GameGPU. Testing at 1440p would have been more convincing, provided they showed a detailed rundown. Not that they need to convince me, but not being forthcoming is what got them here.
 
Joined
Aug 22, 2014
Messages
39 (0.01/day)
Their example of CoD doesn't mention quality settings. The only one that does is SoM, at less than 3.5GB. In CoD:AW at 4K VHQ you're already supposed to be at 4GB. BF4 at 4K VHQ uses 2.3GB. Details are lacking.

CoD:AW can allocate 3.2GB of memory on a 980 at 1080p VHQ, according to GameGPU. Testing at 1440p would have been more convincing, provided they showed a detailed rundown. Not that they need to convince me, but not being forthcoming is what got them here.

I agree, more thorough testing is required; I was mainly getting at the fact that their initial reasoning and simple example made sense to me. I think this will all blow over and people will go, "Ah, yeah, that's hardly a big deal in the big picture; the GTX 970 is still a killer deal."
 
Joined
Jul 1, 2011
Messages
336 (0.07/day)
System Name Matar Extreme PC.
Processor Intel Core i9-10900KF @5.2GHZ AVX-1 & Uncore @4.6GHZ @1.285v
Motherboard Gigabyte Z590 UD , With PCIe X1 Card intel killer 1650x card
Cooling CoolerMaster ML240L V2 AIO with MX6
Memory 4x16 64GB DDR4 3600MHZ CL16-19-19-39 G.SKILL Trident Z NEO
Video Card(s) Nvidia ZOTAC RTX 3080 Ti Trinity overclocked +100 core +1000 mem
Storage WD black 512GB Nvme OS + 1TB 970 Nvme Samsung & 4TB WD Blk 256MB cache 7200RPM
Display(s) Lenovo 34" Ultra Wide 3440x1440 144hz 1ms G-Snyc
Case NZXT H510 Black with Cooler Master RGB Fans
Audio Device(s) Internal , EIFER speakers & EasySMX Wireless Gaming Headset
Power Supply Aurora R9 850Watts 80+ Gold, I Modded cables for it.
Mouse Onn RGB Gaming Mouse & Logitech G923 & shifter & E-Break Sim setup.
Keyboard GOFREETECH RGB Gaming Keyboard, & Xbox 1 X Controller
VR HMD Oculus Rift S
Software Windows 10 Home 22H2
Benchmark Scores https://www.youtube.com/user/matttttar/videos
NVIDIA should start offering price cuts or rebates on the GTX 970; that should make everybody happy.
 

SamuilM

New Member
Joined
Jan 26, 2015
Messages
2 (0.00/day)
These numbers look OK... but my experience with a 2GB card is that FPS drops only when the game needs significantly more memory than you have. The first thing that happens is actually stuttering. Usually when I play a game that saturates the framebuffer with, say, 2.1GB, FPS is almost unaffected... but it stutters like crazy. When I try to play a game like Shadow of Mordor on Ultra, FPS plummets to the low 20s, from about 60 on High.
To think that I was actually waiting on a double-framebuffer variant of the 970 to upgrade my 2GB 770. The only reason I want to upgrade is that I made the mistake of getting a 2GB card instead of a 4GB one. The performance is enough for my needs, but the stuttering is not something I can live with. Some more in-depth tests would be nice; I know stuttering can be subjective, but there must be a way to test that aspect of the performance.
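
There is: record per-frame times and look at the high percentiles rather than just the average. Here's a minimal, self-contained sketch of the idea, with a fake render loop (a sleep with an occasional hitch) standing in for the game:

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Record per-frame times and report the 99th-percentile frame time alongside
// the average: stutter shows up as a large gap between the two even when the
// average FPS looks fine.
int main() {
    std::vector<double> frameMs;
    for (int i = 0; i < 600; ++i) {
        auto t0 = std::chrono::steady_clock::now();
        // Fake "render": 16 ms frames with a 60 ms hitch every 100th frame.
        std::this_thread::sleep_for(std::chrono::milliseconds(i % 100 == 0 ? 60 : 16));
        auto t1 = std::chrono::steady_clock::now();
        frameMs.push_back(std::chrono::duration<double, std::milli>(t1 - t0).count());
    }
    std::sort(frameMs.begin(), frameMs.end());
    double avg = 0;
    for (double ms : frameMs) avg += ms;
    avg /= frameMs.size();
    double p99 = frameMs[static_cast<size_t>(frameMs.size() * 0.99)];
    std::printf("avg frame time: %.1f ms (%.0f fps)\n", avg, 1000.0 / avg);
    std::printf("99th percentile: %.1f ms (%.0f fps)\n", p99, 1000.0 / p99);
}
```

A big gap between the average and the 99th-percentile frame time is exactly the "fine FPS, terrible stutter" situation described above; reviewers capture the same thing on real games with frame-time logging tools.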
 
HTC

Joined
Apr 1, 2008
Messages
4,601 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
NVIDIA should start offering price cuts or rebates on the GTX 970; that should make everybody happy.

More testing needs to be done before that.

If this is indeed the result of this card's problem, then no amount of price cuts or rebates can make up for it and a recall SHOULD be made for cards showing this.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
The price will likely drop a bit anyway once AMD actually has a new product to sell; the current fire sale of products in a market already flooded with cheap ex-miners clearly doesn't make much difference.

The card will already be 6+ months old by then too.
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
The issue is only a big deal because it was hidden and can cause problems. The solution from the beginning should have been to advertise the card as a 3.5GB card, with the last 0.5GB that can cause the issues locked out or reserved for basics; that would have just been a nice bonus for people. When you hide a fact from people, it tends to make them mad and make things bigger than they might be. Not everyone will be affected by this, and many (if not most) who bought the card probably would not care, at this price point and performance, whether they were buying a 3.5GB card or a 4GB card. It's the people who bought the card specifically for tasks that use that extra 0.5GB (like 4K, surround, or some serious texture packs at high resolutions and FPS) who might see this issue come into play and might have changed their mind about the purchase.

Making this out to be no big deal is how you end up dealing with this type of issue again and again. Voicing issues and opinions is how you end up with better products, which is great for everyone and drives things forward.
 
Easy Rhino

Linux Advocate
Staff member
Joined
Nov 13, 2006
Messages
15,436 (2.43/day)
Location
Mid-Atlantic
System Name Desktop
Processor i5 13600KF
Motherboard AsRock B760M Steel Legend Wifi
Cooling Noctua NH-U9S
Memory 4x 16 Gb Gskill S5 DDR5 @6000
Video Card(s) Gigabyte Gaming OC 6750 XT 12GB
Storage WD_BLACK 4TB SN850x
Display(s) Gigabye M32U
Case Corsair Carbide 400C
Audio Device(s) On Board
Power Supply EVGA Supernova 650 P2
Mouse MX Master 3s
Keyboard Logitech G915 Wireless Clicky
Software The Matrix
Ha! There were a lot of armchair engineers on here yapping about this and that. /rekt

Just because you can Google doesn't mean you have any idea what you are talking about.
 
Joined
Nov 12, 2012
Messages
635 (0.15/day)
Location
Technical Tittery....
System Name "IBT 10x Maximum Stable"
Processor Intel i5 4690K @ 4.6GHz -> 100xx46 - 1.296v
Motherboard MSI MPower Z97
Cooling Corsair H100i + 2x Corsair "HP Edition" SP120's
Memory 4x4GB Corsair Vengence Pro 2400mhz @ 2400MHz 10-11-12-31-1T - 1.66v
Video Card(s) MSI Gaming GTX970 4GB @ 1314 Core/1973 Mem/1515 Boost
Storage Kingston 3K 120GB SSD + Western Digital 'Green' 2TB + Samsung Spinpoint F3 1TB
Display(s) Iiyama Prolite X2377HDS 23" IPS
Case Corsair Carbide 300R
Audio Device(s) Rotel RA-04/Cambridge Audio Azur 540R + B&W DM302/Cerwin Vega AT12 / Sony MDR-XB700 & FiiO E5
Power Supply EVGA NEX650G + Silverstone Extensions
Mouse Always failing me....
Keyboard Endlessly broken.....
Software Windoze 7 Pro 64-bit/Windoze 10 Pro
Benchmark Scores I had some of these once upon a time? Old age has seen me misplace them....
Ha! There were a lot of armchair engineers on here yapping about this and that. /rekt

Just because you can Google doesn't mean you have any idea what you are talking about.

A voice of reason! :toast:
 