
NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

Some people think that nVidia's business practices set a record low, and they openly say so.
You consider them "fanbois" and "irrational". Well, so what?

Maybe that's where the problem is? Why should you care about "bitching"? Why not just get over it?


Bingo. Most people judge GPU performance based on reviews, not real life experience.

Yep. It's damn subjective. Most things expressed in perf bars are hardly noticeable in real life; that's why we have things like the "halo product" effect, more 3xx cards being sold after Fiji was released, etc.

And your point was?

Pretty sure my point was pretty clear. He was calling the GTX 970 garbage based on information that came out after the card's release and after its performance had already been advertised. The card's performance was the same before we knew this information and afterwards, which basically nullifies his point about the card being garbage because of the way the memory subsystem is set up.
 
...card's performance was the same before we knew this information, and afterwards.

I understand your POV. Now please try to understand the other one.

Imagine that you buy a car that is advertised as having 4-wheel drive and later (somehow) discover that it only has, lol, 3-wheel drive. Would your car become any worse because of it? Nope. Did the manufacturer save costs by not delivering what was advertised? Yep. Did he lie to you? Yep. That's the whole point.
 
Asinine analogy.

I see both sides of this. In the end, A LOT of people made more out of this than they needed to (and seemingly are still at it...). Performance never changed, people rarely hit the slowdown, and there is actually 4GB, not 3.5GB; that last 512MB is just slower. Nobody shit themselves when they did this on the 660 Ti... but now it's a big deal... MEH.
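For what it's worth, the 3.5+0.5 split is something you can poke at yourself instead of arguing about. Below is a rough CUDA sketch (mine, not from anyone in this thread; the 256MB chunk size and trivial kernel are arbitrary) that fills the card in chunks and then times the same kernel on the first and last chunk allocated. The driver decides which physical segment each allocation actually lands in, so treat any difference as indicative only.

// Rough probe of the GTX 970's segmented memory (an assumption-laden sketch):
// fill the card in chunks, then time a trivial kernel on the first and last
// chunk allocated. The driver decides which physical segment each allocation
// lands in, so any measured difference is indicative only.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void touch(float *p, size_t n)
{
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) p[i] = p[i] * 2.0f + 1.0f;   // simple read-modify-write
}

static float time_touch(float *p, size_t n)
{
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    const int threads = 256;
    const int blocks = (int)((n + threads - 1) / threads);
    cudaEventRecord(start);
    touch<<<blocks, threads>>>(p, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return ms;
}

int main()
{
    const size_t chunkBytes  = 256ull << 20;             // 256 MB per chunk
    const size_t chunkFloats = chunkBytes / sizeof(float);
    std::vector<float *> chunks;

    // Keep allocating until the card is (nearly) full, so the last chunks are
    // likely -- not guaranteed -- to sit in the slow 512 MB segment.
    float *p = nullptr;
    while (cudaMalloc(&p, chunkBytes) == cudaSuccess)
        chunks.push_back(p);
    cudaGetLastError();                                   // clear the expected out-of-memory error

    if (chunks.size() < 2) { printf("not enough chunks allocated\n"); return 1; }

    time_touch(chunks.front(), chunkFloats);              // warm-up launch, result discarded
    float firstMs = time_touch(chunks.front(), chunkFloats);
    float lastMs  = time_touch(chunks.back(),  chunkFloats);
    printf("first chunk: %.2f ms   last chunk: %.2f ms\n", firstMs, lastMs);

    for (float *q : chunks) cudaFree(q);
    return 0;
}

If the last chunk takes noticeably longer than the first on a nearly full 970, you are most likely seeing the slow segment; on cards with a uniform memory interface the two timings should be about the same.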
 
I understand your POV. Now please try to understand the other one.

Imagine that you buy a car that is advertised as having 4-wheel drive and later (somehow) discover that it only has, lol, 3-wheel drive. Would your car become any worse because of it? Nope. Did the manufacturer save costs by not delivering what was advertised? Yep. Did he lie to you? Yep. That's the whole point.

That's completely different. In your analogy there is false advertising at play. nVidia advertised the card as having 4GB because that is what it physically has, and it can physically address it. How it addresses the memory was never spoken of.
 
I understand your POV. Now please try to understand the other one.

Imagine that you buy a car that is advertised as having 4-wheel drive and later (somehow) discover that it only has, lol, 3-wheel drive. Would your car become any worse because of it? Nope. Did the manufacturer save costs by not delivering what was advertised? Yep. Did he lie to you? Yep. That's the whole point.

I respect your viewpoint, however: the number of people who bought 970s in the last 13 months AFTER the information came out, were aware of the issue, and bought anyway FAR outnumbers the number sold before the issue became public.

So they knew what they were buying; no lies were told to anyone who bought after that. In short, it really isn't an issue for most people, and user reviews on multiple sites, which indicate buyers knew of the issue before purchase, back this up.
 
The problem with the GTX 970's memory is:

a) special profiles are needed for some games
b) nVidia has the power to make this GPU useless if they want to.

With a normal memory architecture that's not so easy, and you can keep using your graphics card.
But with the memory architecture of the GTX 970, nVidia is able to make this chip useless more or less with the flip of a switch.

And have you guys learned nothing from The Witcher 3??
Yes, that game where a GTX 780/Titan was beaten by a ~200€ card like the GTX 960.
This happened with a couple of other games, like Batman...
So you can take this as confirmation that nVidia easily drops support for older hardware (well, at least the optimisation work).

What do you think will happen when there is a successor to the GTX 970??
Do you really think that nVidia will still optimize for a card that is no longer manufactured or sold??
They could also use the memory architecture to cripple this card so that most newer games are unplayable -> most GTX 970 users will run to the next store and buy the next GeForce card...
 
And have you guys learned nothing from The Witcher 3??
Yes, that game where a GTX 780/Titan was beaten by a ~200€ card like the GTX 960.
This happened with a couple of other games, like Batman...
So you can take this as confirmation that nVidia easily drops support for older hardware (well, at least the optimisation work).

Honestly, you've been reading too many media reports instead of trying things yourself. This is where real world experience counts for more.

I had the 780, and used it to play TW3 the first two times I played the game. Most who know me know I am an image-quality-over-frame-rate person. I ran the 780 with almost all settings on high to very high, except shadows on medium, and no HairWorks. I still played at a consistent 50 to 60 fps.

Also owning a 960, it's almost embarrassing for the 960 how badly it gets beaten. For it to put out a mere 30 to 35 fps, most settings MUST be dropped way down to medium. Think about that... same resolution, settings that don't produce visuals as good as the 780 had, and it is only JUST playable on a 960.

In conclusion, the only thing a 960 wins at versus a 780 in TW3 is the title of Lower Performing Card.
 
Still rocking the 970 here. I don't care what people say about why it has 3.5+0.5 instead of a full 4. As long as you don't run demanding games with hi-res texture mods and graphics settings tweaked to Very High or Ultra for the sake of eye candy, you'll be perfectly fine hitting well over 50fps on average most of the time, with slight dips to 40fps.
 
@rtwjunkie the 960 is a mid-range card, so no surprise here since it can't outpace its older brother, the 780. With 2GB on most models and a few that have 4GB, it's not meant to win fps contests, but it's good enough for those who are on a tight budget and want a 1080p-ready card that handles most games well.
 
Darn. I'm ready to throw down some money on a Pascal, but I'm gonna have to wait longer...

Same here :)

Hopefully it'll be cheaper on the electric bill than the HD 5870 I have at the mo.
 
I'm seeing the full GP104 chip in the ~7 TFLOPS area at best, which is just around a 1200-1300MHz boost on a 980 Ti (rough math at the end of this post), plus a lot of marketing crap about lower power again.

IMO, for someone on a high-end Maxwell GM200 chip, a GP104 GTX 1080 is not really a worthy upgrade.

I would maybe trade my new 980 Ti for a full GP100 chip, but I don't see the point at 1080p, well, unless it's literally 2x faster...
Think I'll just stick with my original plan and wait for Volta and AMD's offering in Q1 2018... :)
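For reference, the ~7 TFLOPS ballpark is just the usual 2-FLOPs-per-core-per-clock estimate: a 980 Ti's 22 SMs × 128 cores at ~1.25GHz works out to roughly 7 TFLOPS FP32. A minimal CUDA sketch of that arithmetic, assuming 128 cores per SM (correct for Maxwell/Pascal consumer chips, but a hard-coded assumption here):

// Back-of-the-envelope FP32 throughput: 2 FLOPs (one FMA) per core per clock.
// CORES_PER_SM = 128 is an assumption for Maxwell/Pascal consumer chips.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    const int CORES_PER_SM = 128;
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    double ghz = prop.clockRate / 1e6;   // clockRate is reported in kHz
    double tflops = 2.0 * prop.multiProcessorCount * CORES_PER_SM * ghz / 1e3;
    printf("%s: %d SMs @ %.2f GHz -> ~%.2f TFLOPS FP32\n",
           prop.name, prop.multiProcessorCount, ghz, tflops);
    return 0;
}

On a 980 Ti boosting around 1.25GHz this prints roughly 7.0 TFLOPS, which is where the "a ~7 TFLOPS GP104 is basically an overclocked 980 Ti" argument comes from.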
 