Wednesday, January 28th 2015

NVIDIA to Tune GTX 970 Resource Allocation with Driver Update

NVIDIA plans to release a fix for the GeForce GTX 970 memory allocation issue. In an informal statement to users of the GeForce Forums, an NVIDIA employee said that the company is working on a driver update that "will tune what's allocated where in memory to further improve performance." The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point, and if current owners are not satisfied with their purchase, they should return it for a refund or exchange.
Source: GeForce Forums
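
The statement does not say how the driver will decide what lands in which memory segment. As a purely illustrative sketch (not NVIDIA's actual allocator), a placement policy for the card's widely reported 3.5 GB fast / 0.5 GB slow split might simply prefer the fast segment and treat the slow one as overflow, or as a home for buffers the application rarely touches:

```python
# Illustrative two-segment placement policy -- NOT NVIDIA's driver code.
# Segment sizes follow the widely reported GTX 970 split: a 3.5 GiB fast
# segment (7 memory controllers) and a 0.5 GiB slow segment (1 controller).

FAST_CAPACITY = 3584 * 1024 * 1024   # 3.5 GiB
SLOW_CAPACITY = 512 * 1024 * 1024    # 0.5 GiB

class SegmentedHeap:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def allocate(self, size, low_priority=False):
        """Place a buffer: fast segment first, slow segment only as overflow
        (or when the caller marks the buffer as rarely accessed)."""
        if not low_priority and self.fast_used + size <= FAST_CAPACITY:
            self.fast_used += size
            return "fast"
        if self.slow_used + size <= SLOW_CAPACITY:
            self.slow_used += size
            return "slow"
        if self.fast_used + size <= FAST_CAPACITY:   # last resort for overflow
            self.fast_used += size
            return "fast"
        raise MemoryError("VRAM exhausted; would spill to system memory")

heap = SegmentedHeap()
print(heap.allocate(512 * 1024 * 1024))                    # -> fast
print(heap.allocate(64 * 1024 * 1024, low_priority=True))  # -> slow
```

In practice the driver would presumably also have to migrate allocations that are already resident, which is likely the harder part of the tuning.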

89 Comments on NVIDIA to Tune GTX 970 Resource Allocation with Driver Update

#27
xfia
Seems like he may be onto something. I mean, AMD is giving FreeSync away for free even with the practically empty bank account they have. If he really digs into the code, I think he'll come up with something more like FreeSync and show that G-Sync is a bloated technology made to earn them more money, when the hardware at hand was already capable of producing the same end result; basically what AMD is already showing to be true.

@ryun don't be so fast to dismiss this... people were complaining about the 970 on the NVIDIA forums since day one with no real answers.
Posted on Reply
#28
john_
If the above is even partially true, it means that FreeSync/Adaptive-Sync produces the same result as G-Sync at no extra cost.
Posted on Reply
#29
GhostRyder
Recus

When you say Nvidia should compensate GTX 970 owners, remember that others should do it too.
If you're going to make a joke like that, you need to at least make it right. There are 8 cores inside an FX 8XXX-series or 9XXX-series chip: it's 4 modules with 2 cores per module, meaning it really is an 8-core chip.

Also, the PS4 and Xbox One do have 8 GB of memory inside, but some of it is reserved for system resources. However, both companies have recently stated they are/will be allowing more and more of it to be accessed by games.

Anyway, any update will help the GTX 970, but it will never eliminate the root cause completely. It might be better if they find a way to utilize the ~500 MB in a different manner that frees the main 3.5 GB for its necessary tasks. That could be a good way to get some extra performance, making the 3.5 GB feel a bit larger and be used more effectively, though that is just a random thought. Any performance work is good for the owners and for the future, so if they can help it at all that will be nice.

Also why are we discussing G-Sync vs FreeSync on this thread? I do not see the relevance?
Posted on Reply
#30
64K
GhostRyderAlso why are we discussing G-Sync vs FreeSync on this thread? I do not see the relevance?
This happens in a lot of Nvidia/AMD/Intel threads. People who dislike a particular company and like another will dredge up anything to dump in that thread. There will probably be people 10 years from now still talking about the 970 misrepresentation.
Posted on Reply
#31
Casecutter
GhostRyderAnyway, either way any update will help the GTX 970 but it will never alleviate the root cause completely.
Yeah, Nvidia will continue to press the engineers originally tasked with finding a "workaround" for the need to "fuse off" defective/damaged L2, to keep refining this new process feature. This won't be an isolated case going forward; what Nvidia terms "partial disabling" will propagate into future products. As btarunr's article said yesterday, "This team (PR) was unaware that with "Maxwell," you could segment components previously thought indivisible, or that you could "partial disable" components." Hopefully, Nvidia will make such "disablements" more seamless and unnoticeable in upcoming product releases.

As for them finding any "across the board" FPS gains, that's highly doubtful outside of a few instances; perhaps they'll make it more usable for those running SLI. Either way, it seems Nvidia is sweeping this under the rug.
Posted on Reply
#32
wickedcricket
Rahmat SofyanIt'll be a miracle if this problem can be fixed by just a driver update... fingers crossed :)

Oh yeah, this just in

Truth about the G-sync Marketing Module (NVIDIA using VESA Adaptive Sync Technology – Freesync)






Source

I wish this wasn't true...

PS: Sorry if OT :), really bored with the 3.5 GB hype...
Haha, read that just a minute ago! :) Well, well, what d'ya know! :O
Posted on Reply
#33
alwayssts
I haven't weighed in on this whole thing as I really don't know what to add that's constructive.

I certainly have noticed micro-stutter in Mordor that I associate with this, and it sucks. It's very noticeable, even in benchmarking (where you see a very quick plummet in framerate). It would be great if I could use higher resolutions/textures (memory-heavy options) and simply have a lower, if consistent, experience... I feel the resolution options are granular enough (say from 2560x1440 -> 2688x1512 -> etc.) that the space between 3.5 and 4 GB would be useful. I would also have appreciated it if the L2C/SYS/XBAR clocks had been set to GPC clocks out of the box (like previous generations?), as raising those seemed to help things quite a bit for me. There are simply quite a few nvidia and AIB (mostly EVGA) kerfuffles that really rub me the wrong way about this product since launch, and they are only being fixed/addressed now because of greatly-appreciated community digging.
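
(As an aside, that kind of plummet can be quantified rather than eyeballed. A rough sketch that flags frame-time spikes against a rolling median is below; the log file name, its format of one frame time in milliseconds per line, and the spike threshold are all assumptions for illustration.)

```python
# Rough frame-time spike detector, purely illustrative.
# Assumes a plain-text log ("frametimes.txt" is a hypothetical name) with
# one frame time in milliseconds per line, as many benchmark tools can dump.

def find_stutters(path, spike_factor=2.5, window=60):
    times = [float(line) for line in open(path) if line.strip()]
    stutters = []
    for i, ft in enumerate(times):
        # Compare each frame against the median of the preceding window.
        recent = sorted(times[max(0, i - window):i]) or [ft]
        median = recent[len(recent) // 2]
        if ft > spike_factor * median:
            stutters.append((i, ft, median))
    return stutters

for frame, ft, baseline in find_stutters("frametimes.txt"):
    print(f"frame {frame}: {ft:.1f} ms vs ~{baseline:.1f} ms baseline")
```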

That all said, and this doesn't excuse it, these issues are being addressed... and that means something. Yes, nvidia screwed the pooch by not disclosing this information at launch, but not only does this revelation not change the product, it also holds their feet to the fire for optimizations more so than if it had been disclosed from the start. It also does not change the fact that when I run at 2560x1440 (etc.), which is really all I need from my viewing distance, I am still getting better performance than any other 9.5'' card on the market. I feel that for the price I paid, the performance is fair. Had I paid more than I did, or conversely had they cut the product to 12 SMMs/192-bit, I would likely be disappointed. This is obviously a very well-thought-out and well-placed product, and it still deserves praise for the weird amalgamation that it is.

Edit: added a pic of how I changed the L2C etc. clocks. I know this is a common mod among the community, but I am curious whether it helped others stabilize things as much as it helped me.

Posted on Reply
#34
RejZoR
Recus

When you say Nvidia should compensate GTX 970 owners, remember that others should do it too.
If you see 8 cores in Windows, any system will detect 8 cores and you'll have 8 physical threads, meaning it is in fact an 8-core CPU and they weren't lying. The design behind those 8 threads is irrelevant, because the fact still stands that you have 8 physical threads. Unlike the missing ROPs, which are just missing (or shall I say non-functional) PHYSICALLY. NVIDIA advertised the GTX 970 as a 64 ROP card only for it to be uncovered as a 56 ROP card. Not quite the same, aye?

And as for the memory capacity: it says 8 GB and, guess what, the PS4 has 8 GB of memory. Are you saying it isn't on the PCB? Oh wait... No one said the GTX 970 doesn't have 4 GB of memory, because we all know that it does. But the way it accesses and utilizes that memory, that's the shitty part and the reason for the outrage beyond 3.5 GB. Get your facts straight, man...

Would you mind telling us where you got the 1.2 billion transistor count? From speculation on desktop Jaguar cores? The PS4 isn't a desktop system, you know; it's custom-designed for the PS4 specifically and only based upon desktop components...
Posted on Reply
#35
GhostRyder
RejZoRIf you see 8 cores in Windows, any system will detect 8 cores and you'll have 8 physical threads, meaning it is in fact an 8-core CPU and they weren't lying. The design behind those 8 threads is irrelevant, because the fact still stands that you have 8 physical threads. Unlike the missing ROPs, which are just missing (or shall I say non-functional) PHYSICALLY. NVIDIA advertised the GTX 970 as a 64 ROP card only for it to be uncovered as a 56 ROP card. Not quite the same, aye?

And as for the memory capacity: it says 8 GB and, guess what, the PS4 has 8 GB of memory. Are you saying it isn't on the PCB? Oh wait... No one said the GTX 970 doesn't have 4 GB of memory, because we all know that it does. But the way it accesses and utilizes that memory, that's the shitty part and the reason for the outrage beyond 3.5 GB. Get your facts straight, man...

Would you mind telling us where you got the 1.2 billion transistor count? From speculation on desktop Jaguar cores? The PS4 isn't a desktop system, you know; it's custom-designed for the PS4 specifically and only based upon desktop components...
The transistor count he's referring to is the Bulldozer figure, which was revised from 2 billion down to 1.2 billion a little while down the road after a mistake was caught; it concerns that chip and not the PS4, unless I have misinterpreted. But your assertion is correct: there are 8 cores on the chip, and the PS4 (and Xbox, for that matter) both have the memory there and it's functional/used.
CasecutterYeah, Nvidia will continue to press the engineers originally tasked with finding a "workaround" for the need to "fuse off" defective/damaged L2, to keep refining this new process feature. This won't be an isolated case going forward; what Nvidia terms "partial disabling" will propagate into future products. As btarunr's article said yesterday, "This team (PR) was unaware that with "Maxwell," you could segment components previously thought indivisible, or that you could "partial disable" components." Hopefully, Nvidia will make such "disablements" more seamless and unnoticeable in upcoming product releases.

As for them finding any "across the board" FPS gains, that's highly doubtful outside of a few instances; perhaps they'll make it more usable for those running SLI. Either way, it seems Nvidia is sweeping this under the rug.
Yep, the problem is more than just wrong original specs; sweeping this under the rug as a miscommunication, especially in the memory area where most of the problems lie, is the root of the issue. It is an area they need to work on, and next time they should just say it up front instead of shipping something that has issues which cannot be resolved (or cannot be easily resolved). It does not make the card bad, just not advertised correctly, especially to those wanting to go extreme resolution on a budget; the area where this probably has the biggest impact is SLI, since many people I have heard from chose to purchase two of these over a GTX 980 (or R9 290/X).
64KThis happens in a lot of Nvidia/AMD/Intel threads. People who dislike a particular company and like another will dredge up anything to dump in that thread. There will probably be people 10 years from now still talking about the 970 misrepresentation.
Yeah, that has been like half of the threads regarding this. It's all "hey, this is similar to how the other company did (insert thing here)"; it's not relevant in this instance, but I have to make the other side look bad as well or the balance of the universe is out of whack. We just need to focus on the subjects at hand, and these wars will lessen as the people starting them start being ignored.
Posted on Reply
#36
newtekie1
Semi-Retired Folder
RejZoRIf you see 8 cores in Windows, any system will detect 8 cores and you'll have 8 physical threads, meaning it is in fact an 8-core CPU and they weren't lying.
Looks like Windows sees 4 cores...
RejZoRNVIDIA advertised the GTX 970 as a 64 ROP card only for it to be uncovered as a 56 ROP card. Not quite the same, aye?
The card does in fact have 64 ROPs. It just only uses 56 because using the others would actually make the card slower.
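
(As an aside, the logical-versus-physical distinction being argued above is easy to check on any system. Here is a minimal sketch using Python's standard library plus the third-party psutil package; what it reports for an FX chip will depend on how the OS exposes Bulldozer modules.)

```python
# Quick check of logical vs. physical CPU counts -- illustrative only.
# os.cpu_count() reports logical processors (what the OS "sees");
# psutil (third-party: pip install psutil) can report physical cores.

import os
import psutil

logical = os.cpu_count()
physical = psutil.cpu_count(logical=False)

print(f"logical processors: {logical}")
print(f"physical cores:     {physical}")
```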
Posted on Reply
#37
xorbe
Everyone knew the PS4 was a shared-memory architecture. NV hid the brain-damaged design. Two different scenarios.
Posted on Reply
#38
Uplink10
The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point
If you exclude AMD cards which have a very low price right now.
Posted on Reply
#39
Xzibit
Uplink10If you exclude AMD cards which have a very low price right now.
He is an engineer. Nvidia established that they don't read web sites. :)
Posted on Reply
#40
alwayssts
Uplink10If you exclude AMD cards which have a very low price right now.
I was going to say that was arguably dependent upon how a given 290X clocks, but then I checked the prices at Newegg.

Holy shit. 290x got super cheap. Like...wow. I was thinking they were $360-370 (or whatever the last price drop was), in which case I think the premium over a 290 vanilla was (and is) still worth it. At $280 for a 290x though (!) you're totally right. If you can handle one of those beasts, they are one hell of a deal.

I still stand by the nicety that is a short card, though, as well as being able (even if by BIOS mods) to draw a huge amount of power over 2x 6-pin. Having a highish-end card in a mid-range package is super nice... granted, it's totally out of spec and perhaps bound one day to release the magic smoke. The fact that it exists and CAN do it, though, is worth the overall smallish premium in $/perf to me; others may see perf/W at stock as a boon. It's all relative (and in regard to PR-speak at that), but yeah... those 290Xs are a nice deal.
XzibitHe is an engineer. Nvidia established that they don't read web sites. :)
:roll:................:lovetpu:
Posted on Reply
#41
HumanSmoke
Uplink10
The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point
If you exclude AMD cards which have a very low price right now.
Nvidia (and Intel) rarely if ever reference AMD by means of comparison. As market leaders they don't acknowledge a smaller player in their respective markets - standard market position strategy.
Posted on Reply
#42
ShurikN
newtekie1The card does in fact have 64 ROPs. It just only uses 56 because using the others would actually make the card slower.
Haha, that's even worse.
Posted on Reply
#43
L337One91
So what happens to those who can't return the card due to the return period having expired?
Posted on Reply
#44
Casecutter
To put an upside to this... It is just a more "granular" implementation of what Nvidia has done with segmented/asymmetrical/unbalanced memory configurations going back to the GTX 500 series.

While NVIDIA has never fully explained in depth how such memory allocation is handled on those cards, it worked and we knew it was there (we've perhaps gotten closer now than they ever wanted). Nvidia should've at least presented a very top-level overview, saying this is engineering based on multiple generations of experience. This time I feel they didn't want it known they'd implemented it on such a high-performance card. It's not that big a deal given the price/performance; trade-offs are made for yields and for delivering the enormous volume they needed...
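
(For reference, the back-of-the-envelope arithmetic behind the GTX 970's particular split, using the widely reported figures of 7 Gbps GDDR5 on a 256-bit bus, works out as below. This is illustrative arithmetic, not an official NVIDIA breakdown.)

```python
# Bandwidth split implied by the widely reported GTX 970 memory layout.
# Illustrative arithmetic only, not an official NVIDIA breakdown.

data_rate_gbps = 7.0      # effective GDDR5 data rate per pin
bus_width_bits = 256      # eight 32-bit memory channels
channels = 8

total_bw = data_rate_gbps * bus_width_bits / 8   # GB/s across all channels
per_channel = total_bw / channels

fast_segment_bw = per_channel * 7   # 3.5 GB segment: 7 channels
slow_segment_bw = per_channel * 1   # 0.5 GB segment: 1 channel

print(f"total:       {total_bw:.0f} GB/s")         # 224 GB/s
print(f"3.5 GB fast: {fast_segment_bw:.0f} GB/s")  # 196 GB/s
print(f"0.5 GB slow: {slow_segment_bw:.0f} GB/s")  # 28 GB/s
```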

But the fact is that engineering crafted it (as customers finally unearthed), while marketing, I really believe, couldn't bring themselves to admit the gritty truth, and they deserve the blame. I've said there should be some type of restitution, but as I'm not affected, others can figure that out.
Posted on Reply
#45
TheGuruStud
If you can sue Jimmy John's over literally nothing, then you can definitely sue Nvidia for lying in their advertising of the product.
Posted on Reply
#46
HumanSmoke
CasecutterBut the fact is that engineering crafted it (as customers finally unearthed), while marketing, I really believe, couldn't bring themselves to admit the gritty truth, and they deserve the blame. I've said there should be some type of restitution, but as I'm not affected, others can figure that out.
Well, by the sounds of it, Nvidia (judging by the statements from their forum reps) are moving in that direction. As for not being affected, I wouldn't let that stop you - plenty of people here don't use Nvidia products, let alone the GTX 970, and it doesn't stop them shouting down the owners of the actual card being discussed. It's a pity that the maturity shown by the community when AMD* falsely advertised that their flagship offered video decode hardware is now largely missing. We truly live in a Golden Age of Enlightenment Entitlement :roll:

* AMD acquired ATI in October 2006; the 2900 XT launched in May 2007.
Posted on Reply
#47
Casecutter
HumanSmokeWe truly live in a Golden Age of Enlightenment Entitlement :roll:
Folks who buy because they like/want/gravitate to the features being advertised, and then find that the product is not as described/promised, are entitled to some course of restitution. If companies act with impunity, what stops it the next time, or stops others who see those guys got away with it? We are doomed to see it happen, as you point out... again. Is this the time (the line in the sand) where we finally stand enlightened to those goings-on?

It's wrong and not something anyone should take lightly or slight.
Posted on Reply
#48
HumanSmoke
CasecutterFolks who buy because they like/want/gravitate to the features being advertised, and then find that the product is not as described/promised, are entitled to some course of restitution.
I think that is actually being addressed, is it not?
CasecutterIf companies act with impunity, what stops it the next time, or stops others who see those guys got away with it? We are doomed to see it happen, as you point out... again.
So, you are of the opinion that it was a planned strategy from the get-go as well? Just as a counterpoint to that notion, here's AMD and B3D guru Dave Baumann's take:
Perfectly understandable that this would be "discovered" by end users rather than theoretical noodling around. For one, the tools for the end user have got better (or at least, more accessible) to be able to discover this type of thing. Second, the fundamental interconnects within a GPU are not the parts that are ever discussed, because largely they aren't necessary to know about
CasecutterIs this the time (the line in the sand) where we finally stand enlightened to those goings-on?
Well, we have five different threads devoted to the subject here. I guess the next advertising/marketing misstep (assuming there is one, by your reckoning) should be a doozy.
CasecutterIt's wrong and not something anyone should take lightly or slight.
The issue certainly isn't, but the level of outrage being shown over a hardware component whose performance hasn't deviated one iota since it was launched and reviewed is certainly cause for humour.
Posted on Reply
#49
xfia
alwaysstsI haven't weighed in on this whole thing as I really don't know what to add that's constructive. [...] It also does not change the fact that when I run at 2560x1440 (etc.), which is really all I need from my viewing distance, I am still getting better performance than any other 9.5'' card on the market. [...]
Not really much of a performance difference versus a 290X at 1440p.
Posted on Reply
#50
Kyuuba
Can software repair broken hardware?
Posted on Reply