
NVIDIA to Tune GTX 970 Resource Allocation with Driver Update

Joined
Sep 7, 2011
Messages
233 (0.05/day)
Location
Pekanbaru - Riau - Indonesia - Earth - Universe
System Name My Best Friend...
Processor Qualcomm Snapdragon 650
Motherboard Made By Xiaomi
Cooling Air and My Hands :)
Memory 3GB LPDDR3
Video Card(s) Adreno 510
Storage Sandisk 32GB SDHC Class 10
Display(s) 5.5" 1080p IPS BOE
Case Made By Xiaomi
Audio Device(s) Snapdragon ?
Power Supply 2A Adapter
Mouse On Screen
Keyboard On Screen
Software Android 6.0.1
Benchmark Scores 90339
It'll be a miracle if these problems can be fixed by a driver update alone... fingers crossed :)

Oh yeah, this just in

Truth about the G-sync Marketing Module (NVIDIA using VESA Adaptive Sync Technology – Freesync)

Basically, what NVIDIA is trying to force you to do is buy their module license while actually using VESA Adaptive-Sync technology!

Let's be clearer. CUDA was made to force developers to work on an NVIDIA GPU instead of an AMD GPU. The reason is that if CUDA could be used more widely, like DirectCompute or OpenCL, it would certainly work on AMD GPUs as well. That is not good for NVIDIA, and the same goes for PhysX, which could work on AMD GPUs (it currently runs only on CPUs and NVIDIA GPUs).

NVIDIA wants to dominate the GPU segment, and they are ready to close off everything they make, which is not good at all. They keep misleading their customers and developers.

The main problem here is that NVIDIA doesn't like standards, so they create licenses for each of their products and collect a lot of royalties.

Freesync on AMD = Adaptive-Sync like G-Sync

Basically, the NVIDIA drivers control everything between the G-Sync module and the GeForce GPU. The truth is that the G-Sync module does nothing more than confirm that it is present.

Which monitors are compatible with the G-Sync modded drivers?

All external monitors that include DP 1.2 (made after 2010) and all laptops that include eDP.

Examples: Yamakasi, Crossover, Dell, and Asus monitors (such as the PB278Q), plus laptops from MSI, Alienware, Asus, etc.


Source

I wish this wasn't true...

PS: Sorry if OOT :), really bored with the 3.5GB hype..
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
Seems like he may be onto something... I mean, AMD is going free with FreeSync even with the practically empty bank account they have. I think if he really digs into the code he will come out with something more like FreeSync and show that G-Sync is a bloated technology made to make them more money, when the technology at hand was already capable of producing the same end result. Basically what AMD is already showing to be true.

@ryun don't be so fast to dismiss this.. people were complaining about the 970 on the NVIDIA forum since day 1 with no real answers.
 
Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
If the above is true even partially, it means that Freesync/Adaptive-Sync produces the same result as G-Sync at no cost.
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016


When you say Nvidia should compensate GTX 970 owners, remember that others should do it too.
If you're going to make a joke like that, you need to at least get it right. There are 8 cores inside an FX 8XXX series or 9XXX series chip: it's 4 modules, 8 cores, because there are 2 cores per module, meaning it really is an 8-core.

Also, the PS4 and Xbox One do have 8GB of memory inside, but some of it is reserved at times for system resources. However, they have recently stated that they are (or will be) allowing more and more of it to be accessed by games.

Anyway, any update will help the GTX 970, but it will never alleviate the root cause completely. It might be better if they find a way to utilize the ~500MB in a different way that frees the main RAM for its necessary tasks. That could be a good way to get some extra performance, making the 3.5GB feel a bit larger and using it effectively, though that is just a random thought. Any performance work is good for the owners and for the future, so if they can help it at all, that is going to be nice.

Also why are we discussing G-Sync vs FreeSync on this thread? I do not see the relevance?
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Also why are we discussing G-Sync vs FreeSync on this thread? I do not see the relevance?

This happens on a lot of Nvidia/AMD/Intel threads. People who dislike a particular company and like another will dredge up anything to dump in that thread. There will probably be people 10 years from now still talking about the 970 misrepresentation.
 
Joined
Apr 19, 2011
Messages
2,198 (0.46/day)
Location
So. Cal.
Anyway, any update will help the GTX 970, but it will never alleviate the root cause completely.
Yea, Nvidia will continue to press those originally tasked with finding a "workaround" for the necessity of fusing off defective/damaged L2 to keep refining this new process feature. This won't be an isolated case going forward; what Nvidia terms "partial disabling" will propagate into the future. As btarunr's article said yesterday, "This team (PR) was unaware that with "Maxwell," you could segment components previously thought indivisible, or that you could "partial disable" components." Nvidia will need to make such "disablements" more seamless and unnoticeable with upcoming product releases.
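For context, the arithmetic behind the split is straightforward if you take NVIDIA's own corrected figures (256-bit bus, 7 Gbps GDDR5, one of eight L2/ROP partitions fused off, leaving the last 32-bit channel unable to interleave with the rest):

$$
\begin{aligned}
\text{per 32-bit channel:}\quad & 7\,\text{Gbps/pin} \times 32\,\text{pins} \div 8\,\text{bits/byte} = 28\ \text{GB/s} \\
\text{3.5 GB partition (7 channels):}\quad & 7 \times 28\ \text{GB/s} = 196\ \text{GB/s} \\
\text{0.5 GB partition (1 channel):}\quad & 1 \times 28\ \text{GB/s} = 28\ \text{GB/s}
\end{aligned}
$$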

As for them finding any "across the board" FPS gains: highly doubtful, other than in a few instances; perhaps they'll get it to be more usable for those with SLI. Either way, it seems Nvidia is sweeping this under the rug.
 
Joined
Sep 3, 2013
Messages
75 (0.02/day)
Processor Intel i7 6700k
Motherboard Gigabyte Z170X-Gaming 7
Cooling Noctua NH-D14
Memory Patriot 16GB
Video Card(s) Gigabyte GeForce GTX 1080 Ti
Storage (system) Corsair 120GB Force GT, (games) 500GB 840, (data) Seagate 1TB Barracuda + WD 1TB Black
Display(s) AOC AG271QG
Case NZXT Phantom
Power Supply Corsair ax1200i
Mouse g502
Keyboard G710+
Software Win10
It'll be a miracle if these problems can be fixed by a driver update alone... fingers crossed :)

Oh yeah, this just in

Truth about the G-sync Marketing Module (NVIDIA using VESA Adaptive Sync Technology – Freesync)

Source

I wish this wasn't true...

PS: Sorry if OOT :), really bored with the 3.5GB hype..


haha read that just a minute ago! :) Well, well what d'ya know! :O
 
Joined
May 13, 2008
Messages
669 (0.11/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I haven't weighed in on this whole thing as I really don't know what to add that's constructive.

I certainly have noticed micro-stutter in Mordor that I associate with this, and it sucks. It's very noticeable, even in benchmarking (where you see a very quick plummet in framerate). It would be great if I could use higher resolution/textures (memory-heavy options) and simply have a lower, if consistent, experience.... I feel the resolution options are granular enough (say from 2560x1440->2688x1512->etc) that the space between 3.5-4GB would be useful. I would also have appreciated the L2C/SYS/XBAR clocks being set to GPC clocks out of the box (like previous generations?), as raising those up seemed to help things quite a bit for me. There are simply quite a few nvidia and AIB (mostly EVGA) kerfuffles that have rubbed me the wrong way about this product since launch, and they are only being fixed/addressed now because of greatly-appreciated community digging.

That all said, and this doesn't excuse it, these things are being addressed... and that means something. Yes, nvidia screwed the pooch by not disclosing this information at launch, but not only does this revelation not change the product, it also holds their feet to the fire for optimizations more so than if it had been disclosed from the start. It also does not change the fact that when I run at 2560x1440 (etc), which is really all I need from my viewing distance, I am still getting better performance than any other 9.5'' card on the market. I feel that for the price I paid, the performance is fair. Had I paid more than I did, or conversely had they cut the product to 12 SMMs/192-bit, I would likely be disappointed. This is obviously a very well thought-out and placed product, and it still deserves praise for the weird amalgamation that it is.

Edit: added pic of how I changed l2c etc clocks. I know this is a common mod amongst the community, but am curious if this helped others stabilize things as much as it helped me.

 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)


When you say Nvidia should compensate GTX 970 owners, remember that others should do it too.

If you see 8 cores in Windows, any system will detect 8 cores and you'll have 8 physical threads, meaning it is in fact an 8-core CPU and they weren't lying. What design sits behind those 8 threads is irrelevant, because the fact still stands that you have 8 physical threads. Unlike the missing ROPs, which are just missing (or shall I say not functional) PHYSICALLY. NVIDIA advertised the GTX 970 as a 64 ROP card only for it to be uncovered as a 56 ROP card. Not quite the same, aye?

And as for the memory capacity: it says 8GB and, guess what, the PS4 has 8GB of memory. Are you saying it isn't on the PCB? Oh wait... No one said the GTX 970 doesn't have 4GB of memory, because we all know that it does. But the way it accesses and utilizes that memory, that's the shitty part and the reason for the outrage beyond 3.5GB. Get your facts straight, man...
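For anyone who wants to see it rather than argue about it: the community exposed the slow segment with small CUDA programs (Nai's benchmark being the famous one) that fill the card chunk by chunk and time each chunk. Below is a minimal sketch of that idea, not Nai's actual code; the 128 MB chunk size and the copy loop are arbitrary choices of mine, and exact numbers will vary with whatever else is using VRAM. On a 970, the last chunks land in the 0.5GB partition and report a fraction of the bandwidth (~28 GB/s vs ~196 GB/s):

```cpp
// Hypothetical sketch in the spirit of Nai's benchmark: allocate VRAM in
// 128 MB chunks, then time device-to-device copies inside each chunk.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

int main() {
    const size_t chunk = 128ull << 20;          // 128 MB per allocation
    std::vector<char*> chunks;
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (int i = 0; i < 30; ++i) {              // up to ~3.75 GB total
        char* p = nullptr;
        if (cudaMalloc(&p, chunk) != cudaSuccess) break;  // out of VRAM
        chunks.push_back(p);

        // Copy the first half of the chunk onto its second half a few
        // times, timing with CUDA events on the default stream.
        cudaEventRecord(start);
        for (int r = 0; r < 10; ++r)
            cudaMemcpy(p + chunk / 2, p, chunk / 2, cudaMemcpyDeviceToDevice);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        // 10 copies; each reads chunk/2 bytes and writes chunk/2 bytes.
        double gbps = 10.0 * chunk / (ms / 1000.0) / 1e9;
        printf("chunk %2d (%4zu MB filled): %.1f GB/s\n",
               i, (i + 1) * (chunk >> 20), gbps);
    }
    for (char* p : chunks) cudaFree(p);
    return 0;
}
```

Compile with nvcc and run with as little else using the GPU as possible, since other programs' allocations shift where the chunks land.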

Would you mind telling us where you got the 1.2 billion transistor count? From speculation about desktop Jaguar cores? The PS4 isn't a desktop system, you know; it's custom-designed for the PS4 specifically and based upon desktop components...
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
If you see 8 cores in Windows, any system will detect 8 cores and you'll have 8 physical threads, meaning it is in fact an 8-core CPU and they weren't lying. What design sits behind those 8 threads is irrelevant, because the fact still stands that you have 8 physical threads. Unlike the missing ROPs, which are just missing (or shall I say not functional) PHYSICALLY. NVIDIA advertised the GTX 970 as a 64 ROP card only for it to be uncovered as a 56 ROP card. Not quite the same, aye?

And as for the memory capacity: it says 8GB and, guess what, the PS4 has 8GB of memory. Are you saying it isn't on the PCB? Oh wait... No one said the GTX 970 doesn't have 4GB of memory, because we all know that it does. But the way it accesses and utilizes that memory, that's the shitty part and the reason for the outrage beyond 3.5GB. Get your facts straight, man...

Would you mind telling us where you got the 1.2 billion transistor count? From speculation about desktop Jaguar cores? The PS4 isn't a desktop system, you know; it's custom-designed for the PS4 specifically and based upon desktop components...
The transistor count he's referring to is the Bulldozer figure, which AMD revised from 2 billion down to 1.2 billion some time later when a mistake was caught; it concerns that chip and not the PS4, unless I have misinterpreted. But your assertion is correct, since there are 8 cores on the chip, and the PS4 (and Xbox for that matter) both have the memory there and it's functional/used.

Yea, Nvidia will continue to press those originally tasked with finding a "workaround" for the necessity of fusing off defective/damaged L2 to keep refining this new process feature. This won't be an isolated case going forward; what Nvidia terms "partial disabling" will propagate into the future. As btarunr's article said yesterday, "This team (PR) was unaware that with "Maxwell," you could segment components previously thought indivisible, or that you could "partial disable" components." Nvidia will need to make such "disablements" more seamless and unnoticeable with upcoming product releases.

As for them finding any "across the board" FPS gains: highly doubtful, other than in a few instances; perhaps they'll get it to be more usable for those with SLI. Either way, it seems Nvidia is sweeping this under the rug.
Yep, the problem is more than just wrong original specs, and sweeping this under the rug as a miscommunication, especially in the memory area where most of the problems lie, is the root of the issue. It is an area they need to work on, and next time they should just say it up front instead of shipping something with issues that cannot be resolved (or cannot be easily resolved). It does not make the card bad, just not advertised correctly, especially to those wanting to go extreme resolution on a budget; the biggest area where this is impactful is probably SLI, since that is why many people I have heard from chose to purchase two of these over a GTX 980 (or R9 290/X).

This happens on a lot of Nvidia/AMD/Intel threads. People who dislike a particular company and like another will dredge up anything to dump in that thread. There will probably be people 10 years from now still talking about the 970 misrepresentation.
Yea, that has been like half of the threads regarding this. It's all "hey, this is similar to how the other company did (insert thing here)"; it's not relevant in this instance, but I have to make the other side look bad as well or the balance of the universe is out of whack. We just need to focus on the subjects at hand, and these wars will lessen as the people starting them start being ignored.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
If you see 8 cores in Windows, any system will detect 8 cores and you'll have 8 physical threads, meaning it is in fact an 8-core CPU and they weren't lying.


Looks like Windows sees 4 cores...
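That screenshot matches what the Windows API itself reports. Here's a minimal sketch (my own illustration, not anyone's benchmark) that counts cores the way Task Manager does, via GetLogicalProcessorInformation; on FX-83xx chips with current Windows updates, each 2-core module typically shows up as one core with two logical processors, hence "4 cores":

```cpp
// Sketch: count physical cores vs. logical processors on Windows using
// GetLogicalProcessorInformation. How an FX chip is reported depends on
// the CPU and the Windows version/scheduler updates installed.
#include <cstdio>
#include <vector>
#include <windows.h>

int main() {
    DWORD len = 0;
    GetLogicalProcessorInformation(nullptr, &len);   // query required size
    std::vector<SYSTEM_LOGICAL_PROCESSOR_INFORMATION> info(
        len / sizeof(SYSTEM_LOGICAL_PROCESSOR_INFORMATION));
    if (!GetLogicalProcessorInformation(info.data(), &len)) return 1;

    int cores = 0, logical = 0;
    for (const auto& e : info) {
        if (e.Relationship != RelationProcessorCore) continue;
        ++cores;
        // Each set bit in ProcessorMask is one logical processor on this core.
        for (ULONG_PTR m = e.ProcessorMask; m; m >>= 1) logical += (int)(m & 1);
    }
    printf("physical cores: %d, logical processors: %d\n", cores, logical);
    return 0;
}
```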


NVIDIA advertised the GTX 970 as a 64 ROP card only for it to be uncovered as a 56 ROP card. Not quite the same, aye?

The card does in fact have 64 ROPs. It just only uses 56 because using the others would actually make the card slower.
 
Joined
Feb 14, 2012
Messages
2,323 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Everyone knew the PS4 was a shared-memory architecture. NV hid the brain-damaged design. Two different scenarios.
 
Joined
Dec 16, 2014
Messages
421 (0.12/day)
The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point
If you exclude AMD cards, which have very low prices right now.
 
Joined
May 13, 2008
Messages
669 (0.11/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
If you exclude AMD cards, which have very low prices right now.

I was going to say that was arguable depending on how a given 290x clocks, but then I checked the prices at Newegg.

Holy shit. The 290x got super cheap. Like... wow. I was thinking they were $360-370 (or whatever the last price drop was), in which case I think the premium over a vanilla 290 was (and is) still worth it. At $280 for a 290x though (!) you're totally right. If you can handle one of those beasts, they are one hell of a deal.

I still stand by the nicety that is a short card, though, as well as one able (even if by BIOS mods) to draw a huge amount of power over 2x 6-pin. Having a highish-end card in a mid-range package is super nice... granted, it's totally outside all kinds of specs and perhaps bound one day to release the magic smoke. The fact it exists and CAN do it, though, is worth the overall smallish premium on $/perf to me; others may see perf/W at stock as a boon. It's all relative (and in regards to PR-speak at that), but yeah... those 290x are a nice deal.

He is an engineer. Nvidia established that they don't read web sites. :)

:roll:................:lovetpu:
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point
If you exclude AMD cards, which have very low prices right now.
Nvidia (and Intel) rarely, if ever, reference AMD by way of comparison. As market leaders they don't acknowledge a smaller player in their respective markets - standard market-position strategy.
 
Joined
Nov 3, 2013
Messages
2,141 (0.56/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
The card does in fact have 64 ROPs. It just only uses 56 because using the others would actually make the card slower.
Haha, that's even worse.
 
Joined
Jan 31, 2014
Messages
5 (0.00/day)
So what happens to those who can't return the card due to the return period having expired?
 
Joined
Apr 19, 2011
Messages
2,198 (0.46/day)
Location
So. Cal.
To put an upside to this... It is just a more "granular" implementation of what Nvidia has done with segmented/asymmetrical/unbalanced memory configurations going back to the GTX 500 series.

While NVIDIA has never fully explained in depth how memory allocation is handled on those cards, it worked and we knew it was there (we've perhaps gotten closer now than they ever wanted). Nvidia should've at least presented a very top-level overview, saying this is engineering based on multiple generations of experience. This time, I feel, they didn't want it known they'd implemented it on such a high-performance card. It's not that big a deal given the price/performance; trade-offs are made for yields and to offer the enormous volume they needed...

But the fact is engineering crafted it (as customers finally unearthed), while marketing, I really believe, couldn't bring themselves to admit the gritty truth, and they deserve the blame. I've said there should be some type of restitution, but as I'm not affected, others can figure that out.
 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
If you can sue Jimmy John's over literally nothing, then you can definitely sue Nvidia for lying in their advertising of the product.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
But the fact is engineering crafted it (as customers finally unearthed), while marketing, I really believe, couldn't bring themselves to admit the gritty truth, and they deserve the blame. I've said there should be some type of restitution, but as I'm not affected, others can figure that out.
Well, by the sounds of it Nvidia (judging by the statements and their forum reps) are moving in that direction. As for not being affected, I wouldn't let that stop you - plenty of people here don't use Nvidia products let alone the GTX 970, and it doesn't stop them shouting down the owners of the actual card being discussed. It's a pity that the maturity shown by the community when AMD* falsely advertised that their flagship offered video decode hardware is now largely missing. We truly live in a Golden Age of Enlightenment Entitlement :roll:

* AMD acquired ATI in October 2006; the 2900 XT launched in May 2007.
 
Joined
Apr 19, 2011
Messages
2,198 (0.46/day)
Location
So. Cal.
We truly live in a Golden Age of Enlightenment Entitlement :roll:
Folks who buy because they like/want/gravitate toward the features being advertised, and then find that it's not as described/promised, are entitled to some course of restitution. If companies act with impunity, what stops it next time? Or others see those guys got away with it, and we are doomed to see it, as you point out... again. Is this the time (the line in the sand) where we finally stand enlightened to those goings-on?

It's wrong and not something anyone should take lightly or slight.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Folks who buy because they like/want/gravitate toward the features being advertised, and then find that it's not as described/promised, are entitled to some course of restitution.
I think that is actually being addressed, is it not?
If companies act with impunity, what stops it next time? Or others see those guys got away with it, and we are doomed to see it, as you point out... again.
So, you are of the opinion that it was a planned strategy from the get-go as well? Just as a counterpoint to that notion, AMD and B3D guru Dave Baumann's take:
Perfectly understandable that this would be "discovered" by end users rather than theoretical noodling around. For one, the tools for the end user have got better (or at least, more accessible) to be able to discover this type of thing. Second, the fundamental interconnects within a GPU are not the parts that are ever discussed, because largely they aren't necessary to know about
Is this the time (the line in the sand) where we finally stand enlightened to those goings-on?
Well, we have five different threads devoted to the subject here. I guess the next advertising/marketing misstep (assuming there is one, by your reckoning) should be a doozy.
It's wrong and not something anyone should take lightly or slight.
The issue certainly isn't, but the level of outrage being shown over a hardware component whose performance hasn't deviated one iota since it was launched and reviewed is certainly cause for humour.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
I haven't weighed in on this whole thing as I really don't know what to add that's constructive.

I certainly have noticed micro-stutter in Mordor that I associate with this, and it sucks. It's very noticeable, even in benchmarking (where you see a very quick plummet in framerate). It would be great if I could use higher resolution/textures (memory-heavy options) and simply have a lower, if consistent, experience.... I feel the resolution options are granular enough (say from 2560x1440->2688x1512->etc) that the space between 3.5-4GB would be useful. I would also have appreciated the L2C/SYS/XBAR clocks being set to GPC clocks out of the box (like previous generations?), as raising those up seemed to help things quite a bit for me. There are simply quite a few nvidia and AIB (mostly EVGA) kerfuffles that have rubbed me the wrong way about this product since launch, and they are only being fixed/addressed now because of greatly-appreciated community digging.

That all said, and this doesn't excuse it, these things are being addressed... and that means something. Yes, nvidia screwed the pooch by not disclosing this information at launch, but not only does this revelation not change the product, it also holds their feet to the fire for optimizations more so than if it had been disclosed from the start. It also does not change the fact that when I run at 2560x1440 (etc), which is really all I need from my viewing distance, I am still getting better performance than any other 9.5'' card on the market. I feel that for the price I paid, the performance is fair. Had I paid more than I did, or conversely had they cut the product to 12 SMMs/192-bit, I would likely be disappointed. This is obviously a very well thought-out and placed product, and it still deserves praise for the weird amalgamation that it is.

Edit: added pic of how I changed l2c etc clocks. I know this is a common mod amongst the community, but am curious if this helped others stabilize things as much as it helped me.



Not really much of a performance difference from a 290x at 1440p.
 