
NVIDIA GeForce GTX 980 Ti Clock Speeds Revealed

Is it just me, or is NVIDIA desperately trying to steer attention away from the Radeon R9-390? Does anyone even care about the 980 Ti at the moment? Especially with the R9-390 being right around the corner...
Well, at least NVIDIA actually releases a card when the opposition does. I seem to remember a whole bunch of "leaked" marketing - performance figures, purported pictures, "coming soon to an etailer near you" - for the 390X just as the Titan X launched.
 
A lot of folk rubbish brands, but it's really an availability heuristic. I've never had a card fail; my most recent AMD cards were 5850s and 7970s, and they didn't fail either. I'm sure you could browse selectively, but this is the first thing I came across:

https://www.pugetsystems.com/labs/a...e-Rates-by-Generation-563/#NVIDIAFailureRates

No mention of brands, but the general takeaway is that AMD of late actually fares worse.
Hardware.fr publishes six-monthly return rates from one of Europe's biggest etailers (see other editions on the site to get a 3-4 year overview). The minimum sample size is fairly large. It makes for some sobering reading for those AMD cheerleaders taking potshots at EVGA.

http://www.hardware.fr/articles/934-5/cartes-graphiques.html

Bear in mind that these are total returns. Failures, buyer's remorse, and customer dissatisfaction (coil whine, too-high expectations, etc.) are all included.
My personal experience with EVGA has been exemplary: 8800U (3), GTX 280 (2), GTX 580, GTX 780 - all overclocked, no complaints.
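
As an aside, the reason the minimum sample size matters is simple binomial statistics. Here's a rough sketch with made-up numbers (illustrative only, not hardware.fr's actual data):

```python
# Back-of-the-envelope: how wide the uncertainty on an observed return
# rate is at various sample sizes (illustrative numbers only, not
# hardware.fr data).
import math

def margin_of_error(rate, n, z=1.96):
    # Normal approximation to the 95% interval for a binomial proportion.
    return z * math.sqrt(rate * (1 - rate) / n)

true_rate = 0.03  # assume a 3% underlying return rate
for n in (100, 200, 1000, 5000):
    moe = margin_of_error(true_rate, n)
    lo, hi = max(0.0, true_rate - moe), true_rate + moe
    print(f"n={n:5d}: plausible observed rate {lo * 100:.1f}%..{hi * 100:.1f}%")
```

At 100 units, a product with a true 3% return rate can plausibly show anything from 0% to over 6%, which is why a tracking cutoff exists at all.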
 
Is it just me, or is NVIDIA desperately trying to steer attention away from the Radeon R9-390? Does anyone even care about the 980 Ti at the moment? Especially with the R9-390 being right around the corner...

You mean the refreshed GPUs they're rebranding and going to sell under the Radeon 300-series flag - the same ones that have been "right around the corner" for years?

AMD6658.1 AMD Radeon R7 360 Graphics Bonaire XTX = Radeon R7 260X
AMD67B0.1 AMD Radeon R9 300 Series Hawaii XT = Radeon R9 290X
AMD67B1.1 AMD Radeon R9 300 Series Hawaii PRO = Radeon R9 290
AMD6810.1 AMD Radeon R7 370 Graphics Curacao XT = Radeon R9 270X
AMD6810.2 AMD Radeon R7 300 Series Curacao XT = Radeon R9 270X
AMD6811.1 AMD Radeon R7 300 Series Curacao PRO = Radeon R9 270
AMD6939.1 AMD Radeon R9 300 Series Tonga PRO = Radeon R9 285
 
Maybe because all AMD cards with GCN already fully support DX12, so they can actually afford to do that. Unlike NVIDIA, which doesn't even manage it with last-gen Maxwell... Just sayin'.

I'm not saying that's the best way to do things from a consumer perspective, but if the prices are right, no one really cares in the end. If people are able to get a rebranded R9-290X for around 200 €, I think many will grab it. After all, this card still goes after the GTX 970 despite its age.

Besides, don't be daft enough to think they'll rebrand the same GPUs for a third time. The R9-370X will never be based on the R9-270X; it will most likely be based on an R9-280X variant (the R9-285X, most likely).
 
LOL 'let their board partners overclock it until kingdom come'...

... don't you mean until you reach NVIDIA's abhorrently low power limit? :p
 
This is interesting.
So compared to a 780 Ti it has double the VRAM, double the pixel fillrate (really??), but 64 fewer shaders and approximately the same texture fill rate and memory bandwidth.
 
I am guessing (hoping) that texture fill rate and memory bandwidth are not saturated, hence the choice?
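
For anyone who wants to check those numbers, here's a minimal sketch. The 980 Ti clock is the leaked base figure, so treat it as an assumption; the 780 Ti values are its public reference specs.

```python
# Theoretical throughput from unit counts and clocks. 980 Ti numbers
# are the leaked ones (96 ROPs, 176 TMUs, 2816 shaders, ~1000 MHz);
# 780 Ti numbers are its public reference specs.

def gpixel_rate(rops, mhz):
    return rops * mhz / 1000.0        # pixel fillrate, GPixel/s

def gtexel_rate(tmus, mhz):
    return tmus * mhz / 1000.0        # texture fillrate, GTexel/s

def mem_bandwidth(bus_bits, gbps):
    return bus_bits / 8 * gbps        # memory bandwidth, GB/s

cards = {
    #              ROPs  TMUs  shaders  MHz   bus   Gbps
    "GTX 780 Ti": (48,   240,  2880,    875,  384,  7.0),
    "GTX 980 Ti": (96,   176,  2816,   1000,  384,  7.0),  # leaked clock
}
for name, (rops, tmus, shaders, mhz, bus, gbps) in cards.items():
    print(f"{name}: {gpixel_rate(rops, mhz):5.1f} GPixel/s, "
          f"{gtexel_rate(tmus, mhz):5.1f} GTexel/s, "
          f"{shaders} shaders, {mem_bandwidth(bus, gbps):.0f} GB/s")
```

That makes the "double the pixel fillrate" claim check out (48 vs. 96 ROPs, before the clock difference even counts), while texture rate and bandwidth land in roughly the same ballpark.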
 
Is it just me, or is NVIDIA desperately trying to steer attention away from the Radeon R9-390? Does anyone even care about the 980 Ti at the moment? Especially with the R9-390 being right around the corner...

I think you could be right. The truth is, with DirectX 12 looming on the horizon, NVIDIA will lose its long-held competitive edge over AMD. There will now be a lot more competition. DX12, beyond being a godsend for developers and gamers, is also an equalizer between the GPU giants.
All in all, good times are coming for PC gamers, performance-wise. With DX12, there are no more issues with providing SLI support for games. We can also, for the first time, consider SLI VR and split-frame rendering realistic prospects.
 
You mean the refreshed GPUs they're rebranding and going to sell under the Radeon 300-series flag - the same ones that have been "right around the corner" for years?

AMD6658.1 AMD Radeon R7 360 Graphics Bonaire XTX = Radeon R7 260X
AMD67B0.1 AMD Radeon R9 300 Series Hawaii XT = Radeon R9 290X
AMD67B1.1 AMD Radeon R9 300 Series Hawaii PRO = Radeon R9 290
AMD6810.1 AMD Radeon R7 370 Graphics Curacao XT = Radeon R9 270X
AMD6810.2 AMD Radeon R7 300 Series Curacao XT = Radeon R9 270X
AMD6811.1 AMD Radeon R7 300 Series Curacao PRO = Radeon R9 270
AMD6939.1 AMD Radeon R9 300 Series Tonga PRO = Radeon R9 285
What I find odd in that derived list is how they devalued the x70-class card to just an "R7". This whole model matrix is really "corrupted" compared to what we knew previously, so I'm not putting much stock in it. Heck, this could simply be back-filling spots for OEM and mobile parts, with the new discrete aftermarket stuff becoming a 4xx series... I doubt the last driver that offered these tell-tale tidbits is even the "reviewers' driver" for the release; AMD probably won't see a need to post that until the reveal at Computex.
The crystal balls are quite anxious about all this, trying to make something out of it.
 
Not to mention that in DX12, older-generation Radeons (like the R9-290) are serious competition for even the latest GeForce cards...

Though I hope multi-card setups will be more user-friendly than the current hacked-together options with game profiles. It's one of the reasons I've always strictly used just one GPU: it's guaranteed to be problem-free. No one can say the same for any SLI or CrossFire setup.

Also, stacking more GPUs on a single card should be a lot easier with DX12...
 
Not to mention that in DX12, older-generation Radeons (like the R9-290) are serious competition for even the latest GeForce cards...

Though I hope multi-card setups will be more user-friendly than the current hacked-together options with game profiles. It's one of the reasons I've always strictly used just one GPU: it's guaranteed to be problem-free. No one can say the same for any SLI or CrossFire setup.

Also, stacking more GPUs on a single card should be a lot easier with DX12...

Absolutely right. I have a twin-980 SLI setup, but even that has (at times) mediocre SLI support. It all depends on the game I'm playing. But with DX12, at least I know that SLI support will be near-flawless and NOT wonky, like it's been hitherto.

Under DX12, AMD's already strong flagships will suddenly find strength their users never knew they had. DX12 not only equalizes the playing field between the GPU giants, it also prolongs the lifetime of each gamer's graphics card! I'm so excited for this change, it's hard not to get carried away. ;)
 
AMD doesn't exactly like it (since it won't sell more of their new cards), but gamers who have R9-290s will be super happy.
 
So compared to the Titan X, this card is probably 35% cheaper and 10% slower? Nice...

It's definitely not nice. For 4K I need something faster, not slower, and 50% cheaper, not 35%...
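
To put that disagreement in numbers, here's a quick sketch assuming the Titan X's $999 list price and the rumored 10% performance deficit:

```python
# Perf-per-dollar relative to a $999 Titan X, assuming the card is
# 10% slower; the two discounts are the ones argued about above.
TITAN_X_PRICE = 999.0

def perf_per_dollar(perf, price):
    return perf / price

baseline = perf_per_dollar(1.00, TITAN_X_PRICE)
for label, discount in (("35% cheaper", 0.35), ("50% cheaper", 0.50)):
    price = TITAN_X_PRICE * (1 - discount)
    ratio = perf_per_dollar(0.90, price) / baseline
    print(f"{label}: ${price:.0f} -> {ratio:.2f}x Titan X perf-per-dollar")
```

Either way the hypothetical card wins on value; the argument is only over whether roughly 1.4x the perf-per-dollar is enough at 4K.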
 
Hardware.fr publishes six-monthly return rates from one of Europe's biggest etailers (see other editions on the site to get a 3-4 year overview). The minimum sample size is fairly large. It makes for some sobering reading for those AMD cheerleaders taking potshots at EVGA.

http://www.hardware.fr/articles/934-5/cartes-graphiques.html

Bear in mind that these are total returns. Failures, buyer's remorse, and customer dissatisfaction (coil whine, too-high expectations, etc.) are all included.
My personal experience with EVGA has been exemplary: 8800U (3), GTX 280 (2), GTX 580, GTX 780 - all overclocked, no complaints.

EVGA has never appeared in the hardware.fr component return pages in all the years I have looked. It could be because (I speculate) the volume of sales in the EU channel is too low, or hardware.fr simply cannot measure EVGA sales due to some restriction.

Edit: If you read the middle of this page, it lists the cards hardware.fr tracks for the return rankings - those that sold more than 200 units (or more than 100 units, shown in italics).
http://www.hardware.fr/articles/934-5/cartes-graphiques.html
 
EVGA has never appeared in the hardware.fr component return pages in all the years I have looked. It could be because (I speculate) the volume of sales in the EU channel is too low, or hardware.fr simply cannot measure EVGA sales due to some restriction.
Probably the brand isn't high-volume in France and Belgium, although it is sold through the major (r)etailers.
Edit: If you read the middle of this page, it lists the cards hardware.fr tracks for the return rankings - those that sold more than 200 units (or more than 100 units, shown in italics).
I am actually aware of this fact, thanks. I have been following the return figures since their inception on Hardware.fr/BeHardware. The only reason I included the link (and the associated links that can be accessed through it) is that a minimum sample size from a major etailer should provide a better factual basis than anecdotal posting by an anonymous forum member - one who seems to be waging a war on EVGA at every opportunity. Even a casual glance at EVGA's Newegg verified-ownership reviews should be accorded better status than such a self-admittedly small sample size.
 
It was already known it would have 96 ROPs to go with the 384-bit memory bus. Now the million-dollar question is whether it has any L2 cache removed and suffers the -0.5 GB penalty.
 
It was already known it would have 96 ROPs to go with the 384-bit memory bus. Now the million-dollar question is whether it has any L2 cache removed and suffers the -0.5 GB penalty.

No, that was the whole rub with the 970: 256-bit, but with ROPs disabled ("enabled but unused"). If the 980 Ti has all its ROPs, then it probably has all its L2.
 
No, that was the whole rub with the 970: 256-bit, but with ROPs disabled ("enabled but unused"). If the 980 Ti has all its ROPs, then it probably has all its L2.

Nope, he is right. It's a matter of whether any L2 is disabled.

[Attached: GTX 970 memory subsystem diagram]
 
Huh, then I need to go re-read; I must have gotten confused again. =/
 
No, that was the whole rub with the 970: 256-bit, but with ROPs disabled ("enabled but unused"). If the 980 Ti has all its ROPs, then it probably has all its L2.
That is my understanding also. ROPs and L2 are linked. Disabling ROPs/L2 (as in the 970) while maintaining an enabled memory controller causes the slowdown of the second partition. As the Anandtech article clearly states (the one whose diagram Xzibit attached in his post):
"In doing this, each and every 32 bit memory channel needs a direct link to the crossbar through its partner ROP/L2 unit. However in the case of the GTX 970 a wrench is thrown into the works, as there are 7 crossbar ports and 8 memory channels"
If the ROPs are fully enabled, then each associated 256 KB L2 slice should also be enabled... and if the memory controller configuration is fully enabled, then the issue that affected the 970 should not be relevant to the 980 Ti.
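
For the curious, here is the arithmetic behind that as a minimal sketch: 32-bit channels at 7 Gbps per pin (the 970's published configuration), with the 980 Ti line assuming a fully enabled GM200.

```python
# Bandwidth math for the ROP/L2 <-> memory-channel linkage described
# above. GTX 970 figures follow the Anandtech write-up; the 980 Ti
# line assumes a fully enabled GM200 (12 channels, 12 crossbar ports).

def segment_bandwidth(channels, gbps_per_pin=7.0, bits_per_channel=32):
    # Each 32-bit channel moves bits_per_channel/8 bytes per transfer.
    return channels * bits_per_channel / 8 * gbps_per_pin  # GB/s

# GTX 970: 8 memory channels but only 7 crossbar ports (one ROP/L2
# unit disabled), so the eighth channel sits behind a shared port.
fast = segment_bandwidth(7)   # the 3.5 GB segment: 7 channels in parallel
slow = segment_bandwidth(1)   # the 0.5 GB segment: the orphaned channel
print(f"GTX 970:    {fast:.0f} GB/s (3.5 GB) + {slow:.0f} GB/s (0.5 GB)")

# GTX 980 Ti, if nothing is disabled: all 6 GB runs at full speed.
print(f"GTX 980 Ti: {segment_bandwidth(12):.0f} GB/s (6 GB)")
```

That's the 196 GB/s + 28 GB/s split that made the 970 infamous, versus a uniform 336 GB/s if the 980 Ti ships with everything enabled.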
 
EVGA GTX 980 Ti list from ShopBLT.

06G-P4-4990-KR NVIDIA REFERENCE FAN : $798.77
06G-P4-4991-KR EVGA ACX2.0+ COOLING : $798.77
06G-P4-4992-KR SC NVIDIA REFERENCE FAN : $815.85
06G-P4-4993-KR SC EVGA ACX2.0+ COOLING : $810.16
06G-P4-4995-KR SC+ WITH BP EVGA ACX2.0+ : $827.25

http://www.shopblt.com/search/order_id=%21ORDERID%21&s_max=25&t_all=1&s_all=GTX980TI

Not surprised, if those turn out to be true.

The bright side: at least you get Titan X performance for $200 less.
 
Ooh, I don't know about that. Someone with $800 to blow on a graphics card probably has $999 for a Titan X. Not sure that pricing is going to fly with people - too close to the top dog.
 