
Nvidia GTX 970 problems: This isn't acceptable.

Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
I still stand by the view that this would not have been an issue if it were just a mistake in the ROP and L2 cache counts, since we have seen the performance numbers and those do not lie. The 0.5GB segment running extremely slowly compared to the rest does pose an issue, however, and is the primary reason this is a problem. Not every game or benchmark will easily push past 3.5GB of VRAM usage, which is where this problem holds its ground: you expect 4GB to handle that, because you bought the card with 4GB. If the problem came from exceeding 4GB, that would not be on the company; the card came with 4GB, you knew what you were getting, and you knew what to expect. But if using that much VRAM on the card causes issues, then you were misled, which is why people have a right to be mad about it.

It should not be labeled the end of the world, but I believe people have the right to be mad; I would be, in the situation many are in. Does it make the card a bad card, though? No, I do not believe so. I still think it's a good card even if the specs were revised and they dropped the 0.5GB segment entirely and made it a 3.5GB card. It would still be an awesome card, use a decently low amount of power, and give people good value - and it still will, for those keeping theirs, buying one, or buying more.

The thing is, this was seen by many, including reviewers, as a great alternative to the GTX 980, 780/780 Ti, R9 290, and R9 290X. Many people (including myself) have recommended it as a great value for higher resolutions (1440p, 2160p, etc.) and, in some cases, a better alternative to other cards on the market at around the same price. Heck, I have seen many people swap out cards for these specifically for the lower power draw and the RAM upgrade. The reality is that this upgrade was not as big as many were expecting, and those people are going to feel cheated and reserve the right to be mad and get their money back. Not everyone is forced to feel that way, and in no way should ANYONE tell someone else to return their card or else they're an idiot - but the reverse is also true: you should not call someone an idiot for returning their card because they feel cheated or are unhappy.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Yup, ever since Fermi.

But there are some key differences between the way Fermi handled it, the way Kepler handled it, and the way Maxwell does today.

Back in the day, the Fermi card (550 Ti) launched with a larger amount of VRAM than was generally the norm (1.5GB where 1GB was the norm for that price bracket). This also meant that games would hardly, if ever, utilize the additional VRAM on that card. Basically, the problem was there, but it was hidden behind 'more hardware'.

Then Kepler's 192-bit bus came around. Kepler was a different beast, but it STILL handled VRAM better than Maxwell's 970 does today. On Kepler's 660 and 660 Ti the 192-bit bus was divided asymmetrically, which meant bandwidth dipped to about 44 GB/s on the last 0.5GB. This also explains why the 660 Ti was a crippled card at 1080p when you used high levels of AA - similarly priced Radeons would beat the 660 Ti any day of the week at higher AA settings. The problem was again present, but hidden, because the same thing happened with the 680 (256-bit) versus the 7970 (384-bit) - though the origin of the problem is different. Where the 680 was choking on overall bandwidth because it had a stronger GK104 behind it, the 660 Ti was choking on that lacking bandwidth on the last 0.5GB.

Now compare Kepler and Maxwell on their bandwidth drop over the last 0.5GB. Very much apples & apples. The (much weaker) Kepler card gets about 44 GB/s on its last 0.5GB. The (over 40% faster) GPU in GM204 has its last 0.5GB on a measly 28 GB/s. Compare that to system DDR3 RAM, which generally delivers something in the neighbourhood of 16 GB/s - that is awfully close. This explains why people run into issues today while they did not, as much, with Kepler or Fermi: the gap in bandwidth between the segments is larger than ever before, while at the same time memory requirements have vastly increased over time *and* the GPU itself is much more powerful.
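To put rough numbers on that, here's a back-of-the-envelope sketch. The clocks are the commonly cited ones for these cards, so treat the results as approximate; the 44 GB/s Kepler figure quoted above assumes a slightly different memory clock than the 6-7 Gbps used here:

```python
# Peak bandwidth = per-pin transfer rate (Gbps) * bus width (bits) / 8 bits-per-byte.
# Clock figures are the commonly cited ones, so results are approximate.

def bandwidth_gb_s(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

# GTX 970: 7 Gbps GDDR5 on a 256-bit bus -> the advertised 224 GB/s...
print(bandwidth_gb_s(7.0, 256))   # 224.0
# ...but the first 3.5GB rides only 224 bits of that bus...
print(bandwidth_gb_s(7.0, 224))   # 196.0
# ...and the last 0.5GB hangs off a single 32-bit path:
print(bandwidth_gb_s(7.0, 32))    # 28.0  <- the "measly 28 GB/s" segment
```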

ENBSeries' characterisation of the card as effectively having only 3.5GB of VRAM is bang on the money. Whatever part of the memory Windows resides in does not even come into play: the last 0.5GB effectively behaves like system RAM.

Now tell us again this is business as usual... It most certainly is not. To everyone marginalizing this issue: please go home, you haven't been paying attention. Maxwell is the Nvidia GPU generation where cost cutting and memory subsystem tricks have gone too far and result in bad gaming - which just so happens to be Nvidia's core business in this market. It is absolutely vital that we as (potential) customers draw the line here. If not for your currently owned 970, then at least do it for future generations of GPU.

Well said. Especially that last line of your post. That line-drawing didn't happen with PC games, and now look at some of the buggy, unfinished, chopped-down messes being sold, with the cut content resold later as DLC. People bitched about the games for a little while and then turned around and pre-ordered the next game from the same publisher.

I don't intend to return my 970 even if given the option. I'm pleased with its performance, but I certainly support anyone who does want a refund. For those of you who do want a refund, take action: fill out the petition, send Nvidia an email, contact the Better Business Bureau (if in the USA).
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Yeah, the painful difference between games and GPUs is that for gaming there is a great indie scene working steadily to break that overpriced triple-A big-publisher dominance, and succeeding at it (look at the sales figures for games like Divinity: OS and other good indies).

But do we see ourselves crowdfunding a new GPU company? Hmm.... Maybe we will get to crowdfund AMD someday, who knows :)
 
Joined
Aug 11, 2011
Messages
4,355 (0.94/day)
Location
Mexico
System Name Dell-y Driver
Processor Core i5-10400
Motherboard Asrock H410M-HVS
Cooling Intel 95w stock cooler
Memory 2x8 A-DATA 2999Mhz DDR4
Video Card(s) UHD 630
Storage 1TB WD Green M.2 - 4TB Seagate Barracuda
Display(s) Asus PA248 1920x1200 IPS
Case Dell Vostro 270S case
Audio Device(s) Onboard
Power Supply Dell 220w
Software Windows 10 64bit
I can probably fudge that together somehow this weekend. I've not used DSR before, so I'm unsure how accurately I can force 3.4GB usage and 3.8GB usage. Starpoint Gemini II uses 3.4GB at maximum settings at 1440p, so that will be a useful example.

DSR wouldn't be a good way to test, since it would put the bottleneck on the pixel fillrate. Many users who were "testing" just ran DSR at insane resolutions and blamed the (expected) fps drop on the 0.5GB partition. Also, as mentioned by actual owners who experienced this, the average frame rates don't suffer, so automated suites won't show anything; reviewers would have had to stand in front of the monitor for the duration of the benchmark to notice the stuttering.
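If someone does want to automate catching it, averages alone won't do it. Here's a minimal sketch (made-up numbers, not from any real log) of why: two runs with nearly the same average fps, where only the spike count and the 99th percentile give the stutter away:

```python
import numpy as np

def summarize(frametimes_ms):
    # Average FPS hides stutter; spike counts and high percentiles expose it.
    ft = np.asarray(frametimes_ms, dtype=float)
    return {
        "avg_fps": round(1000.0 / ft.mean(), 1),
        "p99_ms": round(float(np.percentile(ft, 99)), 1),
        "spikes_over_50ms": int((ft > 50).sum()),
    }

# Two hypothetical one-minute runs at ~60 fps:
smooth = np.full(3600, 16.7)          # perfectly even frame delivery
stutter = smooth.copy()
stutter[::90] = 70.0                  # a 70 ms hitch every 1.5 seconds

print(summarize(smooth))   # ~59.9 avg_fps, p99 16.7 ms, 0 spikes
print(summarize(stutter))  # ~57.8 avg_fps, p99 70.0 ms, 40 spikes
```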
 
Joined
Feb 14, 2012
Messages
2,323 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Frankly, the whole 900 series has been nothing but trouble for me so far.
I have had problems with them since day 1:
black screens, freezes, display detection timeouts, device driver hangs.

Bad card. This is what happens when there are no reference cards and all of them are overclocked by the manufacturer by default. Lowering clocks / upping voltage solves the crashing, TDRs, etc. This is already well known for the 970 (shades of the 560 fiasco). The 3.5GB / 56 ROP / 224-bit thing is just salt in the wound.
 
Joined
May 18, 2005
Messages
65 (0.01/day)
DSR wouldn't be a good way to test, since it would put the bottleneck on the pixel fillrate. Many users who were "testing" just ran DSR at insane resolutions and blamed the (expected) fps drop on the 0.5GB partition. Also, as mentioned by actual owners who experienced this, the average frame rates don't suffer, so automated suites won't show anything; reviewers would have had to stand in front of the monitor for the duration of the benchmark to notice the stuttering.

The problem isn't just that the framerates go down more than they should; it's primarily the huge stuttering and frametime inconsistencies on top of that. DSR is both a standard feature on these cards and a perfectly valid way to expose this issue. Reviewers usually don't, as you said, actually play or sit in front of the computer while their testing runs; they let it do its thing unmonitored and just take the data afterwards. Add in that the vast majority of reviews didn't push high resolutions and settings - and, because of NVidia's false advertising, had no reason to expect an issue doing so - and yes, a lot of reviews were positive, since at first glance, and based on NVidia's marketing, it appeared to be a great product.

Turns out, once we owners got the cards into our systems, that simply wasn't accurate in actual usage. As my testing in the original post (first post of this thread) shows, there are immense inconsistencies in the frame timing, and with GTX 970 cards in these situations that creates an unplayable, juddery, hitching gameplay experience.

Regarding the false advertising and Nvidia's wrongly published specs:
GoldenTiger said:
Nvidia provides review samples to these sites (linked here as approved reviews: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/reviews ) and asks them to pass spec and performance info on to customers on their behalf. Additionally, they falsely advertised a full 4GB of high-speed VRAM, which, as they have now disclosed after being called on it, was simply not true. Add in the impact this has on gaming for many of us (such as myself, a 2x GTX 970 SLI user) on top of the false advertising (which, while I'm not a lawyer, I'm pretty sure is illegal, no?), and they need to take responsibility now, then provide an acceptable and equitable remedy that puts GTX 970 owners in the same or a better position than before they bought based on false product information.

Whether this means a full refund including pre-paid shipping costs back to the vendors, an upgrade to the GTX 980 (which would make sense, considering they manufacture no other similar product - the GTX 980 is almost the same card bar the disabled areas that caused this whole issue - so a plain refund would leave us without cards entirely), etc., needs to be determined and acted upon promptly.

Now that we have the false advertising part cleared up, let's move on once more to the performance and the practical impact (copying the testing from my original post here):


[frametime graph - full size: http://i.imgur.com/cLUvSNO.jpg]


So please, stop doing what this guy is doing below:


They falsely advertised the number of ROPs, the amount of L2 cache, and the memory capacity fully accessible at the advertised speeds. Only 3.5GB of the 4GB physically installed can deliver the proper performance level, and dipping into the other 0.5GB segment, thanks to their purposefully-flawed-by-design memory segmentation, results in major gameplay issues, as I have shown with my testing on my GTX 970 SLI configuration.
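For anyone who'd rather measure than argue: below is a rough sketch, in the spirit of the Nai benchmark that's been circulating, that fills VRAM in 128 MiB steps and times a copy into each chunk. It assumes PyCUDA is installed, and the driver doesn't guarantee allocation order, so treat whatever it prints as illustrative rather than definitive:

```python
import pycuda.autoinit          # creates a context on the default GPU
import pycuda.driver as drv

CHUNK = 128 * 1024 * 1024       # 128 MiB per allocation

src = drv.mem_alloc(CHUNK)      # copy source; early allocations should land in the fast region
chunks = []
while True:
    try:
        dst = drv.mem_alloc(CHUNK)
    except drv.MemoryError:     # PyCUDA raises this once VRAM is exhausted
        break
    start, end = drv.Event(), drv.Event()
    start.record()
    drv.memcpy_dtod(dst, src, CHUNK)
    end.record()
    end.synchronize()
    gb_per_s = (CHUNK / 1e9) / (start.time_till(end) / 1e3)
    chunks.append(dst)
    print(f"chunk {len(chunks):2d}: {gb_per_s:6.1f} GB/s")

# On a 970 the expectation would be a sharp drop on whichever chunks
# end up in the final 0.5GB segment.
```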
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Deceptive Tactics of Nvidia...
 
Joined
Aug 11, 2011
Messages
4,355 (0.94/day)
Location
Mexico
System Name Dell-y Driver
Processor Core i5-10400
Motherboard Asrock H410M-HVS
Cooling Intel 95w stock cooler
Memory 2x8 A-DATA 2999Mhz DDR4
Video Card(s) UHD 630
Storage 1TB WD Green M.2 - 4TB Seagate Barracuda
Display(s) Asus PA248 1920x1200 IPS
Case Dell Vostro 270S case
Audio Device(s) Onboard
Power Supply Dell 220w
Software Windows 10 64bit
So please, stop doing what this guy is doing below:

Dude, I'm agreeing with you. I was just telling RCoon not to use DSR, since in that scenario the pixel fillrate could become the bottleneck. I also told him that average frame rates don't go down, but you get micro stutter, and an automated benchmark suite will never detect that. You must be present during the benchmark to catch it.
 
Joined
Aug 20, 2007
Messages
20,767 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Yup, ever since Fermi.

But there are some key differences between the way Fermi handled it, the way Kepler handled it, and the way Maxwell does today.

Back in the day, the Fermi card (550 Ti) launched with a larger amount of VRAM than was generally the norm (1.5GB where 1GB was the norm for that price bracket). This also meant that games would hardly, if ever, utilize the additional VRAM on that card. Basically, the problem was there, but it was hidden behind 'more hardware'.

Then Kepler's 192-bit bus came around. Kepler was a different beast, but it STILL handled VRAM better than Maxwell's 970 does today. On Kepler's 660 and 660 Ti the 192-bit bus was divided asymmetrically, which meant bandwidth dipped to about 44 GB/s on the last 0.5GB. This also explains why the 660 Ti was a crippled card at 1080p when you used high levels of AA - similarly priced Radeons would beat the 660 Ti any day of the week at higher AA settings. The problem was again present, but hidden, because the same thing happened with the 680 (256-bit) versus the 7970 (384-bit) - though the origin of the problem is different. Where the 680 was choking on overall bandwidth because it had a stronger GK104 behind it, the 660 Ti was choking on that lacking bandwidth on the last 0.5GB.

Now compare Kepler and Maxwell on their bandwidth drop over the last 0.5GB. Very much apples & apples. The (much weaker) Kepler card gets about 44 GB/s on its last 0.5GB. The (over 40% faster) GPU in GM204 has its last 0.5GB on a measly 28 GB/s. Compare that to system DDR3 RAM, which generally delivers something in the neighbourhood of 16 GB/s - that is awfully close. This explains why people run into issues today while they did not, as much, with Kepler or Fermi: the gap in bandwidth between the segments is larger than ever before, while at the same time memory requirements have vastly increased over time *and* the GPU itself is much more powerful.

ENBSeries' characterisation of the card as effectively having only 3.5GB of VRAM is bang on the money. Whatever part of the memory Windows resides in does not even come into play: the last 0.5GB effectively behaves like system RAM.

Now tell us again this is business as usual... It most certainly is not. To everyone marginalizing this issue: please go home, you haven't been paying attention. Maxwell is the Nvidia GPU generation where cost cutting and memory subsystem tricks have gone too far and result in bad gaming - which just so happens to be Nvidia's core business in this market. It is absolutely vital that we as (potential) customers draw the line here. If not for your currently owned 970, then at least do it for future generations of GPU.

Thanks for the detailed explanation! With it laid out properly, I agree this practice has gone too far, and I recant my previous statements defending NVIDIA now that I understand the issue better. I agree with your assessment: we need to stop this here. Build a card right for its price range and modern gaming, or go home.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Maxwell is a good architecture, no one can deny that, but we also can't deny that NVIDIA fucked it up with a moronic design decision on the GTX 970...
 
Joined
Sep 25, 2014
Messages
498 (0.14/day)
Location
Lagos, Nigeria
System Name purplekaycee
Processor Core i7 6700
Motherboard Msi z270 titanium
Cooling Cooler Master hyper 212 EVO
Memory 16gb (2x8gb) corsair LPX vengeance
Video Card(s) Gigabyte gtx 1080ti
Storage 1tb Samsung SSD, 4tb Seagate HDD, 8tb Seagate HDD
Display(s) Dell 23" ST 2310 60hz 1080p monitor
Case Cooler Master H500M
Audio Device(s) Astro A40 headphone
Power Supply Seasonic Focus GX 850. 850W
Mouse Regular mouse
Keyboard Regular keyboard
Software Windows 10 64 bit
You forgot to add "for now". Talk to us in 6 months' time with new games... If people are already experiencing issues on a small scale, what do you think will happen when more demanding games arrive? That things will somehow magically improve? Riiiiight...
So what card would you recommend?
Which one do you use? 'Cause I'm about to buy one.

You forgot to add "for now". Talk to us in 6 months' time with new games... If people are already experiencing issues on a small scale, what do you think will happen when more demanding games arrive? That things will somehow magically improve? Riiiiight...
A card that will play all games now and for the 12 to 18 months to come. I'm sick of getting laggy frames and crashes when I buy new games.
 
Joined
Nov 18, 2011
Messages
245 (0.05/day)
A card that will play all games now and for the 12 to 18 months to come. I'm sick of getting laggy frames and crashes when I buy new games.

None, honestly. It doesn't matter whether you buy a 290/290X or a 970/980: either set of cards will likely run out of shader grunt 12-18 months from now before VRAM is the primary issue. A lot of games lately deliver marginal visual improvements while cutting our framerates in half. If you always want a stable 60FPS+ with placebo Ultra settings (I don't believe in Ultra on 95% of games on the market; I've compared, and it almost never looks better than High while consuming another gigabyte of VRAM), you're probably going to need SLI 980s or CrossFire 290Xs. The smarter move is probably to wait for the new AMD lineup and hope they don't list 500W+ and 105°C as a feature this time around. Those should have a better amount of grunt for 1440p+ resolutions; that's the next step in architecture for both sides.
 
Joined
Nov 30, 2006
Messages
1,002 (0.16/day)
Location
NorCal
System Name Modest Box
Processor i5-4690K @ 4.7 Ghz
Motherboard ASUS Z97-C
Cooling Noctua NH-D15
Memory G.Skill Ares DDR3-2400 16GB
Video Card(s) Colorful GTX 950
Storage OCZ Vertex 460A 480GB
Display(s) HP w2558hc
Case Cooler Master Stacker 830
Audio Device(s) Onboard Realtek
Power Supply Gigabyte 750W Gold
Mouse Microsoft Intellimouse Explorer
Software Windows 10 64 Bit
Correct, only the 3GB version is symmetrical. However, when it was released, 2GB was a lot of VRAM. Who noticed, really? Now a single 660, even a 3GB version, is not going to play the newest games at more than medium settings - the GPU itself can't do it - and on games a couple of years old or more, at medium-high, it's likely still not an issue.

Besides, IIRC, those who did discover this didn't present it as a problem; they just noted that the 3GB versions performed faster.

Same situation with the GTX 550 Ti, GTX 460 v2 (192-bit with 1GB VRAM), GTX 660 Ti, and GTX 560 SE. It's not the first time NV has used this kind of arrangement, and Anandtech reported on it at length. I think what made it easier to swallow was the obvious lack of symmetry between the 192-bit bus width and the 1GB or 2GB of memory. In the case of the GTX 970, the 256-bit/4GB specs made it seem symmetrical.
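To make the asymmetry concrete, here's a toy model (my own sketch, not how the driver actually maps memory) of an interleaved stack over 64-bit controllers with mixed chip densities. With 6 Gbps chips it gives 48 GB/s for the 660 Ti's last segment; the 44 GB/s figure quoted earlier in the thread assumes a slightly different memory clock:

```python
def segments(controllers_mb, gbps_per_pin, pins=64):
    # Toy model: memory is interleaved across every controller that still
    # has free capacity; once the smaller controllers fill up, the rest
    # of the VRAM sits on fewer controllers at proportionally less bandwidth.
    per_ctrl_gb_s = pins * gbps_per_pin / 8
    out, filled = [], 0
    remaining = sorted(controllers_mb)
    while remaining:
        layer = remaining[0] - filled                  # MB contributed per controller
        out.append((layer * len(remaining),            # segment size in MB
                    per_ctrl_gb_s * len(remaining)))   # segment bandwidth in GB/s
        filled = remaining[0]
        remaining = [c for c in remaining if c > filled]
    return out

# GTX 660 Ti-style: 2GB on a 192-bit bus = two 512MB + one 1GB controller, 6 Gbps chips
print(segments([512, 512, 1024], 6.0))
# -> [(1536, 144.0), (512, 48.0)] : 1.5GB at full speed, the last 0.5GB on one controller
```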
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
Same situation with the GTX 550 Ti, GTX 460 v2 (192-bit with 1GB VRAM), GTX 660 Ti, and GTX 560 SE. It's not the first time NV has used this kind of arrangement, and Anandtech reported on it at length. I think what made it easier to swallow was the obvious lack of symmetry between the 192-bit bus width and the 1GB or 2GB of memory. In the case of the GTX 970, the 256-bit/4GB specs made it seem symmetrical.

Please add the 660 (non-Ti) 2GB to that list as well. That version has the exact same memory subsystem as the 660 Ti, although it was based on a different SKU with a lower shader count. The defenders need to recognize that this is a trend in how NVidia builds cheaper products, while their marketing is, probably deliberately, not kept up to date.

@Vayra86 -- Thanks so much for taking the time to lay out the scope of this problem for people.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
So what card would you recommend?
Which one do you use? 'Cause I'm about to buy one.

R9-290X or GTX 980. But personally, I'll wait for the R9-380X. It's so close that it would be stupid to buy something in a rush and regret it later. Wait these 2 more months, imo. That's what I'll do...
 
Joined
Jun 13, 2012
Messages
1,326 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
I guess it should go without saying that the way the starter of this thread did his testing is pretty much unrealistic; 99.9999% of people will never use those settings or see those results. I would bet even Radeon cards would have a hard time running a game at 4K+ resolution with 4x MSAA without stuttering issues when you turn.

I know people will flame the hell out of me for stating that fact.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I guess it should go without saying that the way the starter of this thread did his testing is pretty much unrealistic; 99.9999% of people will never use those settings or see those results. I would bet even Radeon cards would have a hard time running a game at 4K+ resolution with 4x MSAA without stuttering issues when you turn.

I know people will flame the hell out of me for stating that fact.
You're making a supposition, not stating a fact, so your argument doesn't hold.

Also, when testing something, the whole point is to stress it to reveal weaknesses and find the limits of performance, so the fact that the product may not be used that way most of the time is irrelevant.
 

HTC

Joined
Apr 1, 2008
Messages
4,604 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
The most amusing crime here is the fact that I have one; thousands of people (some of whom don't even own one) are complaining about this very clear and factually demonstrated defect, returning their cards for some obscure purpose, and yet for some reason I don't care in the slightest.

I know that my card is crippled. I know that if I spontaneously decide to use DSR to go from 1440p up to ultra mega super HD and then scale it down again, my card will literally soil itself and make a smutty, stuttery, gooey mess inside its poor little gimped memory units.

And yet, still, for some unknown reason, I don't care.

My processor looks down from its lofty 1150 socket on high, stares at the 970 below, and teases it for all its disabled parts - physically present but worthless. Sometimes I try to calm it down, but then some random internet person, who's probably not even running a dedicated GPU of any kind, signs up to some forum and starts laughing at how all these 970s are incapable of running at ultra mega big-baller HD through the use of DSR, doesn't bother looking at the graphs and tests comparing claimed and actual performance, and carries on making multiple threads about something that already has news articles with comment sections and existing 100-page threads on countless forums.

Then I realise: I don't use DSR, I'm not buying 4K any time soon, I have yet to pass a 4GB requirement, and my card was as cheap as dirt. If it had been sold with a puzzling 3.5GB of VRAM, I'd still have bought it. Maybe I'm just a terrible excuse for a customer; maybe I should be more self-entitled, and perhaps I should take up arms with the militia. But frankly, I can't be bothered.

If you don't (and don't plan to) use the whole bandwidth, then this card is probably the best for you when factoring price/performance, because it performs like a monster for its price.

It's only when you try to fully utilize it that you run into problems: if you use (or plan to use) all the bandwidth available (4K resolutions, newer games down the line, whatever else), then it's a whole different ball game.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,449 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850w Gold (ATX3.0)
Software W10
Ok, first point: I'm NOT defending Nvidia. They're dicks for doing this, frankly, and yes, it's made me think about my next purchase (I'll get either the next top-end AMD or NV card).

But I've got to get this verified. In the OP, these are the results at >3.6GB usage:

SLI enabled: very bad frametimes - 40+ peaks over 50ms
SLI disabled (i.e. single card): 9 peaks over 50ms

But then look at this, at settings under 3.5GB:

SLI enabled: 10 peaks above 50ms
SLI disabled: <9 peaks above 50ms (and better spaced out).

This graph only shows that above 3.6GB usage, SLI doesn't work. The frametimes for >3.6GB and under 3.5GB have the same number of spikes above 50ms. In fact, only the SLI setting at 3.6GB+ looks terrible. A single card at 3.6GB has the same frame times as a single card under 3.5GB.

Look at the lines: his dark blue line is the most stable of the lot (hovers at 30ms, with the 9 peaks), and that is the single card at 3.6GB+. It looks better than the single card at <3.5GB.

Can we have some rationalism here? All that graph shows is that SLI at 3.6GB+ is borked. A single card plays fine with expected (and not awful) fps.

Really, everyone - look at that graph again for SoM: the dark blue line is better than the olive-coloured line (3.6GB usage is better than <3.5GB usage).

SLI disabled, over 3.6GB VRAM usage - blue line, median and peaks at or over 50ms:
[graph]

SLI disabled, under 3.5GB VRAM usage - olive line, median and peaks at or over 50ms:
[graph]

I'm sure there are problems, but this clearly demonstrates that there isn't much of a difference between over 3.6GB and under 3.5GB. ONLY SLI seems screwed up.

Also, compare with this:
[graph]

It's clear that at 4K Ultra in SoM (I'm sure it's using more than 3.5GB of VRAM here?) the 970 fares worse, but nothing awful.....

From the OP's own work and this graph from PC Per (using the world's most RAM-heavy game, at Ultra 4K settings), it's not at all clear that the issue being discussed is causing the problems.

Can this be discussed sensibly? I should add I'm currently having sex with JHH's daughter (unless she's under 16 - UK law - in which case I'm having sex with him), so I'm obviously not biased.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
I believe that's part of the issue. The card itself, whether through the driver or otherwise, forces its way back to 3.5GB and does its best not to spill over into the 0.5GB segment.



That's why there are people saying they don't notice much, or saying they are using all 4GB. Some of the burn-in tests posted show an initial 4GB of memory usage, but even in the brief clips that get posted on YouTube you can see them fall back to 3.5GB.
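Easy enough to watch for yourself, by the way: a small sketch that just polls VRAM usage via nvidia-smi's standard query flags. If that fallback behavior is real, you'd expect usage to hover near ~3500 MiB and only briefly spill past it (no promise it shows on every game):

```python
import subprocess
import time

# Poll total VRAM in use once per second while a game runs.
while True:
    used = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"]
    ).decode().strip()
    print(f"{used} MiB in use")
    time.sleep(1)
```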



There is clearly something going on, and it might be tied to game texture settings. A game that already uses a lot of VRAM at 1080p will exhibit the issue more when moving to a higher resolution like 1440p than a game that uses very little VRAM, where upping the resolution does little for VRAM usage and it has to be pushed to 4K - which makes no sense, because you're crippled there anyway on a single card. It makes sense that he is using SLI at 4K.

Also, PCPerspective might be using a different version than the one GT is using, at different settings.




Oh, and a virtual bro fist bump for doing JHH's daughter, or him - but that just means you're bi, not biased.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,449 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850w Gold (ATX3.0)
Software W10
It's obvious there is a problem, but it's not so much with the card as with what it's been sold as. If it had been sold as a 3.5GB card, it'd be a non-issue. It is, as you allude to, the fact that it's being bought for 4K. To be honest, though, I think if you go for a full-fat 4K monitor you really shouldn't scrimp on graphics. For 4K right now, I'd be buying dual 8GB 290Xs or Titan Blacks.

No matter what, Nvidia will have to do some form of damage control; they'll be trying to figure out how to do it without losing face and business to AMD. Going up against NV, AMD have XDMA working to near perfection (when it's optimised), but NV can (and have clearly demonstrated they will) hobble AAA titles with GameWorks. Though looking at AC: Unity - christ, what a mess.

Roll on GM200 and Sea Islands (or whatever it's called).
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
@the54thvoid The best way NVIDIA can avoid losing business to AMD is to offer a free replacement with a GTX 980 and just take the hit; it really won't break the bank. They also need to publicly and unreservedly apologize for this gaffe. Finally, they need to lower the price of the 970 and describe it accurately on their website and on product boxes, or just discontinue it and make a similar product to replace it at the new, lower price point. Somehow I think this company is far too arrogant to do any of this, however.

If AMD eat their lunch, they deserve it - and I'm speaking as a pissed-off NVIDIA owner, not an AMD fanboy, since dodgy tactics like this never help the customer, whichever brand you prefer.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
@the54thvoid The best way NVIDIA can avoid losing business to AMD is to offer a free replacement with a GTX 980 and just take the hit; it really won't break the bank. They also need to publicly and unreservedly apologize for this gaffe. Finally, they need to lower the price of the 970 and describe it accurately on their website and on product boxes, or just discontinue it and make a similar product to replace it at the new, lower price point. Somehow I think this company is far too arrogant to do any of this, however.

If AMD eat their lunch, they deserve it - and I'm speaking as a pissed-off NVIDIA owner, not an AMD fanboy, since dodgy tactics like this never help the customer, whichever brand you prefer.

The irony is, there are hardly any people with your kind of logic in the GTX 970 case. I'm currently an AMD user, and while I often defend AMD when it deserves it, I'm not really a fanboy and will stand on NVIDIA's side as well when it needs defending. But with so many dumb things they've done in the past few years, they hardly ever give me that chance, which is why I've lately been on Radeon cards more than GeForce. I've had my share of NVIDIA cards and AMD/ATi cards and was happy with all of them. I would be really pissed if I had bought a GTX 970 a week ago like I originally wanted to...

People accuse me of being a fanboy and an NVIDIA hater and all that, but I actually watched their GTX 900 series launch presentation with huge enthusiasm and had very positive thoughts about it. And then they destroyed it all with under-the-hood coverups and chip hacks to make the card look better than it really is. That's where I draw the line.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Cheers, RejZoR. I've just realized that I may have implied that all AMD owners are fanboys, which is most certainly not the case. :oops:

Perhaps my sentence should read "...I'm speaking as a pissed off NVIDIA owner not an AMD owner or fanboy..." to make it clear. :)

I also don't see a problem with denying a company my money over tactics like these, even if I think their products are, on the whole, better than the competition's.
 
Joined
Feb 20, 2007
Messages
372 (0.06/day)
Location
Where the beer is good
System Name Karl Arsch v. u. z. Abgewischt
Processor i5 3770K @5GHz delided
Motherboard ASRock Z77 Professional
Cooling Arctic Liquid Freezer 240
Memory 4x 4GB 1866 MHz DDR3
Video Card(s) GTX 970
Storage Samsung 830 - 512GB; 2x 2TB WD Blue
Display(s) Samsung T240 1920x1200
Case Bitfenix Shinobie XL
Audio Device(s) onboard
Power Supply Cougar G600
Mouse Logitech G500
Keyboard CMStorm Ultimate QuickFire (CherryMX Brown)
Software Win7 Pro 64bit