Nvidia GTX 970 problems: This isn't acceptable.

I disagree. The design is very flawed. There is no good reason to have 28 GB/s bandwidth and XOR contention in a 2015 card. Half the VRAM performance of a 2007 midrange card plus XOR contention? How is that anything close to a "shitload of performance" and a "win-win"?

It isn't. It was a very stupid decision to market the card as a 4 GB card. Nvidia should have just disabled that .5 GB altogether. It did not. It fooled people into thinking they were getting a 4 GB card. It seems to be a clear case of bait-and-switch fraud. And, even ignoring the bad business practice altogether, there is no good reason for the design. The only thing close is a business justification based on fooling people into thinking they're getting a 4 GB card.

People need to stop praising them for a combination of deplorable business practice and highly flawed design.
Duh! Well said.

Even if the card works well enough in most scenarios, that's not the point. I just don't get why people become NVIDIA apologists when they pull stunts like this. :shadedshu: Note that I only buy NVIDIA cards nowadays, so I'm not talking from a fanboy perspective, but as a rather pissed-off customer who has lost some trust in the brand.
 
I disagree. The design is very flawed.
The chip is designed this way, every "design flaw" is made on purpose as a tradeoff. Nvidia would be much happier if they didn't have to cut off as much to have good yields.
There is no good reason to have 28 GB/s bandwidth and XOR contention in a 2015 card.
Having greater die harvesting yields to compete with price is a perfectly fine reason.
Half the VRAM performance of a 2007 midrange card
What midrange card from 2007 has 400 GBps memory bandwidth?
It was a very stupid decision to market the card as a 4 GB card.
I agree as this was my main point. So much stupid that it's almost plausible it might have been an unintentional cockup.
Nvidia should have just disabled that .5 GB altogether.
It should be considered disabled (as it doesn't get used before texture memory spills over) and treated as a buffer - you can be sure that accessing system RAM through PCIe, the alternative, is worse because of latency (despite the XOR contention).
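For rough context (these bandwidth figures are my own ballpark, not from this thread): even the crippled segment beats going out over the bus. A quick sketch, assuming PCIe 3.0 x16's theoretical ~15.75 GB/s:

```python
# Ballpark comparison: the GTX 970's slow 0.5 GB segment vs. spilling to
# system RAM over PCIe. Figures are theoretical peaks, not measurements.
PCIE3_X16_GBPS = 15.75    # PCIe 3.0 x16, one direction, theoretical
SLOW_SEGMENT_GBPS = 28.0  # the 970's 0.5 GB segment

ratio = SLOW_SEGMENT_GBPS / PCIE3_X16_GBPS
print(f"Slow segment has ~{ratio:.1f}x the raw bandwidth of PCIe 3.0 x16,")
print("and that's before counting PCIe's much higher latency.")
```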
 
Even if the card works well enough in most scenarios, that's not the point. I just don't get why people become NVIDIA apologists when they pull stunts like this.
I agree, it's pretty bad when people defend NVIDIA doing this like they are our savior or something. Certain trolls think this is OK as long as the card performs decently in enough scenarios, which to me defeats the point of a card lasting a decent amount of time, and it hurts the scenarios that would actually use that memory (like SLI with DSR, 4K, modded games like Skyrim, etc.). Whether or not this affects every person is not the point; it needs to be stopped, and cards need to be advertised correctly, to prevent something similar from happening in the future and to improve everyone's experience.

No one should be, or is, forced to get rid of the card if they like it and it works for them. But voicing these problems will only help the market and get us better products in the future (or at least more understanding of what is being bought).
 
Jeez.

Nobody is defending Nvidia over this. Get over it ffs! People are simply stating in the majority of cases, the card works well for them.
What NV did is frankly either (a) deplorable or (b) inexcusable. One is intentional, one is grossly negligent oversight.
It's amazing how people keep twisting other people's words.
Card good, company bad. Don't like your card? Send it back, don't buy NV again. Stop crying.
 
Card good, company bad.

[image: Hulk facepalm]

Hulk no understand, head hurt
 
I just swapped my MSI 970 Gaming for an MSI 290X Gaming. Performance-wise I can't notice any difference, heat is only about 5 to 10C hotter, and the build quality is a massive improvement.

I had to use a few zip-ties to hold up the rear of my 970 as it sagged very badly.
Necro but... I can't help it...

Temperature and heat are different things!!!

For example the yellow flame in a lighter and the yellow flame in a bonfire are the same temperature. However, it should be clear that there is more energy/heat coming out of the much larger bonfire even though the temperature is the same.

The 970 is a 150W card vs the 290 @ 250W. ;)
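To put numbers on the heat point (TDPs from above; the two-hour session and the joule arithmetic are just my illustration):

```python
# Heat dumped into the room scales with power draw, not GPU temperature.
def heat_dumped_kj(watts: float, hours: float) -> float:
    """Energy released as heat over a session, in kilojoules."""
    return watts * hours * 3600 / 1000  # W * s = J, then J -> kJ

for card, watts in [("GTX 970 (~150 W)", 150), ("R9 290 (~250 W)", 250)]:
    print(f"{card}: ~{heat_dumped_kj(watts, 2):,.0f} kJ over a 2-hour session")
```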
 
The 970 is a 150W card vs the 290 @ 250W. ;)

The 290 is pretty close to 300 watts, just like the X variant.
 
The 290 is pretty close to 300 watts, just like the X variant.

Oh please, this shit doesn't even matter if you use vsync at 60Hz. Secondly, you can downclock the card to 724/1250 along with core -30mV and +30% power limit, and end up cutting around 30-120W (depends on the game) while still having most games run at 60 fps with some drops down to 35 fps.
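For what it's worth, a first-order estimate of where those savings come from (power roughly scales with f x V^2 for CMOS; the 290's stock clock, voltage, and TDP here are my guesses, not measured values):

```python
# Very rough dynamic-power model: P ~ f * V^2. Illustrative only.
def scaled_power(p_stock, f_stock_mhz, f_new_mhz, v_stock, v_new):
    """Estimate power after a downclock/undervolt, first-order CMOS scaling."""
    return p_stock * (f_new_mhz / f_stock_mhz) * (v_new / v_stock) ** 2

# Assumed: R9 290 at ~250 W, 947 MHz, ~1.15 V stock; 724 MHz and -30 mV after.
p = scaled_power(250, 947, 724, 1.15, 1.12)
print(f"Estimated draw after downclock/undervolt: ~{p:.0f} W "
      f"(~{250 - p:.0f} W saved)")  # lands inside the claimed 30-120 W range
```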
 
Oh please, this shit doesn't even matter if you use vsync at 60Hz. Secondly, you can downclock the card to 724/1250 along with core -30mV and +30% power limit, and end up cutting around 30-120W (depends on the game) while still having most games run at 60 fps with some drops down to 35 fps.

If you have a 60Hz monitor it will rarely be pulling that much anyway. Had an XFX DD which ran at stock (reference) speeds and it wasn't too bad except in something like Watch Dogs.
 
Oh please, this shit doesn't even matter if you use vsync at 60Hz. Secondly, you can downclock the card to 724/1250 along with core -30mV and +30% power limit, and end up cutting around 30-120W (depends on the game) while still having most games run at 60 fps with some drops down to 35 fps.
That is a big if...not everyone uses vsync or wants to (input lag).

And why would I (or anyone) get a card and then downclock it to save power? I'm going to buy a Lamborghini........... but have someone put a governor on it because it's too fast??? Whaaaaaaaaaa? :confused:
 
That is a big if...not everyone uses vsync or wants to (input lag).

And why would I (or anyone) get a card and then downclock it to save power? I'm going to buy a Lamborghini........... but have someone put a governor on it because it's too fast??? Whaaaaaaaaaa? :confused:

I only require between 30 and 60 fps, and I slowed the card down since going over 60 fps is pointless for me; in turn I actually turn vsync off as well.
 
Entirely true that we can't do anything about it. But it still feels good to know that we're warning other people about the shortcomings of the GTX 970.

(Don't kill me please, first post on this forum)

Welcome to our site.
 
Jeez.

Nobody is defending Nvidia over this.
Cool story.
Get over it ffs!
No.
People are simply stating
Whenever someone says this in a debate, it basically always means people are saying a bunch of stuff.
in the majority of cases, the card works well for them.
Big deal. "In the majority of cases my mother isn't run over by her neighbor."
What NV did is frankly either (a) deplorable or (b) inexcusable. One is intentional, one is grossly negligent oversight. It's amazing how people keep twisting other people's words.
What, like "get over it", which translates into not doing anything further about the continuing problem? Nvidia still lists the 970 as having 4 GB of 224 GB/s VRAM on its website, "FFS".
Don't like your card? Send it back, don't buy NV again. Stop crying.
:rolleyes: Stop polluting the topic with asinine patronizing comments that also happen to be deceptive. People have a lot more options than your prescription entails, nor should anyone stop publicly taking a stand against fraud just because you or anyone else finds it uninteresting or whatever.
 
What, like "get over it", which translates into not doing anything further about the continuing problem? Nvidia still lists the 970 as having 4 GB of 224 GB/s VRAM on its website, "FFS".

They do have 4GB, that's why.
 
Big deal. "In the majority of cases my mother isn't run over by her neighbor."

When was the last time your graphics card ran over your mother? o_O
 
They do have 4GB, that's why.
Well, it should really be listed as 3.5GB + 512MB, as otherwise it's not clear at all that the memory is segmented and that only 3.5GB of it is full-speed RAM. The segmentation and slower RAM do impact performance in some situations, so it should be stated up front. AMD's equivalent 4GB card is better in this respect, and the nondisclosure gives NVIDIA an unfair advantage by making the cards look like they're on an equal footing.
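To illustrate why "3.5GB + 512MB" is the honest label, here's a toy sketch of fill-then-spill placement (my own simplification, not NVIDIA's actual driver heuristic):

```python
# Toy model: allocations fill the fast 3.5GB segment first and only spill
# into the slow 0.5GB segment under memory pressure.
FAST_MB, SLOW_MB = 3584, 512

def place(size_mb, fast_used, slow_used):
    """Return (segment, fast_used, slow_used) for a new allocation."""
    if fast_used + size_mb <= FAST_MB:
        return "fast, 196 GB/s", fast_used + size_mb, slow_used
    if slow_used + size_mb <= SLOW_MB:
        return "slow,  28 GB/s", fast_used, slow_used + size_mb
    return "evicted / PCIe", fast_used, slow_used

fast = slow = 0
for size in [1024, 1024, 1024, 512, 256, 256]:  # hypothetical buffer sizes, MB
    seg, fast, slow = place(size, fast, slow)
    print(f"{size:4d} MB -> {seg}  (fast={fast} MB, slow={slow} MB)")
```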

It wouldn't surprise me if that upcoming 980 Ti with its crippled GPU is also made like this and again NVIDIA won't disclose this.

Yes, we should never take the pressure off when companies keep on doing deceptive practices like this.
 
But they don't need to; it has 4GB. I guess as long as they don't say "at full speed" or something, they're OK legally.

But I agree they should have to say it, and maybe they do in some document somewhere, hard to find and in super small text, and so on lol.
 
My 970 handles games just dandy; hell, it often does better than other cards with 4GB which will remain nameless.
 
970 in our house handles everything superbly. Hell, it has yet to even hit 3GB of VRAM usage. Where these cards are positioned is the 2 year upgrade cycle. Sure the .5GB of "slightly slower-but-still-much-faster-than DDR VRAM" may be a real issue for me and other non-SLI'ers in a year and a half, but by then it will be time to get a new card anyway.
 
970 in our house handles everything superbly. Hell, it has yet to even hit 3GB of VRAM usage.

I could name a few that would.
 
You could have bought an R9 295X2 and wouldn't have this problem, lol. Anyway, DX12 is near. You can (theoretically speaking) use 2x3.5GB.
 
@rtwjunkie I don't know why you can't accept that the GTX 970 is no good for gaming and The Witcher 3 looks like crap and is getting worse with every patch and after 5 patches will look worse than Minecraft. I think you're one of those stubborn types that wants to use your own eyes and experience rather than listen to the opinions of people that don't own the game or the card. :p
 
@rtwjunkie I don't know why you can't accept that the GTX 970 is no good for gaming and The Witcher 3 looks like crap and is getting worse with every patch and after 5 patches will look worse than Minecraft. I think you're one of those stubborn types that wants to use your own eyes and experience rather than listen to the opinions of people that don't own the game or the card. :p

LMAO!! :laugh: You are SUCH a trolling instigator! Love it!
 
They do have 4GB, that's why.
Keep repeating a lie and it's still a lie.

It's like putting 4 GB of VRAM onto a card where 2 GB isn't connected to the board and wouldn't work at all even if it had been. "It's got 4 GB of VRAM!"

That's called a lie of omission. The card has 4 GB of VRAM but not 4 GB of 224 GB/s VRAM. Hence, the lie of omission.

The 3.5 GB partition maxes out at 196 GB/s and the .5 GB partition maxes out at 28 GB/s. The only way 224 GB/s would normally be possible is when both partitions are active, but it's not actually possible to attain that speed because of the extreme slowness of the 28 GB/s partition and its XOR contention. As Anandtech's article said, the more the slow partition is utilized, the more it slows down the other partition.
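The arithmetic behind those figures, reconstructed from the numbers above (7 Gbps GDDR5 on an 8 x 32-bit bus, per Anandtech's write-up):

```python
# Where 224 GB/s on paper comes from, and how it splits across segments.
GBPS_PER_PIN = 7      # GDDR5 effective data rate, Gbps per pin
CTRL_WIDTH   = 32     # bits per memory controller
CONTROLLERS  = 8      # 7 serve the fast segment, 1 serves the slow one

total = GBPS_PER_PIN * CTRL_WIDTH * CONTROLLERS / 8   # 224 GB/s advertised
fast  = GBPS_PER_PIN * CTRL_WIDTH * 7 / 8             # 196 GB/s (3.5 GB)
slow  = GBPS_PER_PIN * CTRL_WIDTH * 1 / 8             #  28 GB/s (0.5 GB)

print(f"paper: {total:.0f} GB/s = {fast:.0f} (fast) + {slow:.0f} (slow)")
# Reaching 224 would need both segments streaming flat-out at once, which
# the XOR contention described above prevents in practice.
```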
 