Thursday, September 15th 2016

NVIDIA GeForce GTX 1080 Ti Specifications Leaked, Inbound for Holiday 2016?

NVIDIA is putting the finishing touches on its next enthusiast-segment graphics card based on the "Pascal" architecture, the GeForce GTX 1080 Ti. Its specifications were allegedly screengrabbed by a keen-eyed enthusiast snooping around NVIDIA's website before they were redacted. The specs sheet reveals that the GTX 1080 Ti is based on the same GP102 silicon as the TITAN X Pascal, but is further cut down from it. Given that the GTX 1080 is holding firm at its $599-$699 price point, with some custom-design cards even selling for over $800, the GTX 1080 Ti could either be positioned around the $850 mark or be priced lower, disrupting currently overpriced custom GTX 1080 offerings. By pricing the TITAN X Pascal at $1,200, NVIDIA appears to have given itself headroom to price the GTX 1080 Ti in a way that doesn't cannibalize premium GTX 1080 offerings.

The GTX 1080 Ti is carved out of the GP102 silicon by disabling 4 of its 30 streaming multiprocessors, resulting in 3,328 CUDA cores. The resulting TMU count is 208, and the card could retain the chip's full ROP count of 96. The card will be equipped with 12 GB of GDDR5 memory across the chip's 384-bit wide memory interface, instead of the GDDR5X used on the TITAN X Pascal. This should yield 384 GB/s of memory bandwidth, significantly less than the 480 GB/s the TITAN X Pascal enjoys with its 10 Gbps memory chips. The GPU is clocked at 1503 MHz, with a 1623 MHz GPU Boost frequency. The card's TDP is rated at 250W, the same as the TITAN X Pascal.
GeForce GTX 1080 Ti Specifications:
  • 16 nm GP102 silicon
  • 3,328 CUDA cores
  • 208 TMUs
  • 96 ROPs
  • 12 GB GDDR5 memory
  • 384-bit GDDR5 memory interface
  • 1503 MHz core, 1623 MHz GPU Boost
  • 8 GHz (GDDR5-effective) memory
  • 384 GB/s memory bandwidth
  • 250W TDP
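For readers who want to double-check the leaked numbers, here is a quick sketch (Python) of the arithmetic behind them; the per-SM figures of 128 CUDA cores and 8 TMUs are the standard Pascal GP102 layout, and bandwidth is simply bus width times effective data rate.

```python
# Sanity-checking the leaked GTX 1080 Ti figures.
# Assumes the standard Pascal GP102 layout: 128 CUDA cores and 8 TMUs per SM.
total_sms, disabled_sms = 30, 4
active_sms = total_sms - disabled_sms          # 26 SMs enabled

cuda_cores = active_sms * 128                  # 26 * 128 = 3328
tmus = active_sms * 8                          # 26 * 8   = 208

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * effective Gbps per pin."""
    return bus_width_bits / 8 * data_rate_gbps

gtx_1080_ti = bandwidth_gb_s(384, 8)           # GDDR5  @ 8 Gbps  -> 384 GB/s
titan_x_pascal = bandwidth_gb_s(384, 10)       # GDDR5X @ 10 Gbps -> 480 GB/s

print(cuda_cores, tmus, gtx_1080_ti, titan_x_pascal)   # 3328 208 384.0 480.0
```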
Source: OC3D

176 Comments on NVIDIA GeForce GTX 1080 Ti Specifications Leaked, Inbound for Holiday 2016?

#126
Melvis
I wish it was that price here in Aus; the GTX 1080 is already around $1000-$1300, so this is going to be around $1200-$1500, no thank you! I'll just get another second-hand GTX 970 for $250 and call it a day.
Posted on Reply
#127
efikkan
the54thvoidAlso, using Doom Vulkan is an excellent gauge for the future. Made with explicit AMD extensions (because Nv don't have them), it shows about the best-case scenario, IMO, for AMD's future performance. So, given that it's hard to see how much farther GCN can go (and Navi won't have it) and Titan XP (unrealistic card but shows Nvidia's fastest) is far ahead even in Vulkan, it's very hard to see Captain Tom's future.
You are talking about an edge case. Of course, all the PR departments of Intel, AMD, Nvidia, etc. love to bring up these cases, which put the competition at a disadvantage and shed the best possible light on their own product.
The problem with GCN is architectural inefficiency, and cases such as Doom do nothing to fix that:
- Vendor-"optimized" pipelines (e.g. console ports) just make the competition less efficient; they don't actually make GCN that much better.
- Vendor-specific extensions might include tricks, but they still do not address the architectural inefficiencies. Most such tricks do not apply to most cases.
the54thvoidThen there is the elephant in the room which few have had the reasoning to spot. The AMD resurgence is clearly based on the move from DX11 to DX12 and one game using Vulkan (again with explicit AMD extensions). Using this new paradigm, we can expect no similar performance improvements from AMD over Pascal in these API's.
Similar to what? Doom Vulkan vs. OpenGL? Do I need to remind you that AMD's OpenGL support is extremely unstable?

Why would "the new paradigm" suddenly dissipate the architectural problems of GCN? Don't you know that the APIs have nothing to do with how internal GPU scheduling (on the level each GPU thread), GPU memory fetches, etc.? And if the APIs were holding AMD back all these years, how come Nvidia were not held back? You better explain yourself.
the54thvoidThe situation of graphics cards will remain as it has with DX11. A game developed with assistance from AMD or Nv will favour that card. Hitman and Deus Ex both favour AMD. Both were developed in the Nvidia classic style of, 'lets hamper the competition'. Just like TWIMTBP games tend to highlight Nv abilities at the expense of AMD.
Oh, conspiracies!
No one "ever" intentionally "hampers the competition". The real problem is when a game is developed with little or no testing on the other vendor throughout the whole development cycle. If the day-to-day development and testing are all done on one vendor, then it's easy to make design choices which put the other vendor at a disadvantage. This typically results in bottlenecks and scaling issues for the other vendor, and it might not be easy to fine-tune this later. We have always had some AMD(/ATI)- and Nvidia-biased games, but the number of AMD-biased games has increased because both the PS4 and the Xbox One are AMD-based.
the54thvoidDx12 etc will help AMD achieve greater parity but given the Titan XP with fewer shaders than Fury X still soundly beats it in everything (faster clocks but like peeps say, no Async or DX12 magic) then you have to wonder how bad it might be when Nvidia bring back a little parallel async compute based hardware...
Please explain precisely: what will make GCN suddenly grow past its design faults?

This kind of reminds me of the good old Bulldozer days, when all the fans were screaming that new software would make AMD overcome all the issues. :p Of course it never happened, and AMD has finally discarded the inefficient architecture in favor of an architecture more similar to the competition's.
Posted on Reply
#128
the54thvoid
Intoxicated Moderator
efikkanYou are talking about an edge case. Of course, all the PR departments of Intel, AMD, Nvidia, etc. love to bring up these cases, which put the competition at a disadvantage and shed the best possible light on their own product.
The problem with GCN is architectural inefficiency, and cases such as Doom do nothing to fix that:
- Vendor-"optimized" pipelines (e.g. console ports) just make the competition less efficient; they don't actually make GCN that much better.
- Vendor-specific extensions might include tricks, but they still do not address the architectural inefficiencies. Most such tricks do not apply to most cases.


Similar to what? Doom Vulkan vs. OpenGL? Do I need to remind you that AMD's OpenGL support is extremely unstable?

Why would "the new paradigm" suddenly dissolve the architectural problems of GCN? Don't you know that the APIs have nothing to do with internal GPU scheduling (at the level of individual GPU threads), GPU memory fetches, etc.? And if the APIs were holding AMD back all these years, how come Nvidia was not held back? You had better explain yourself.


Oh, conspiracies!
No one "ever" intentionally "hampers the competition". The real problem is when a game is developed with little or no testing on the other vendor throughout the whole development cycle. If the day-to-day development and testing are all done on one vendor, then it's easy to make design choices which put the other vendor at a disadvantage. This typically results in bottlenecks and scaling issues for the other vendor, and it might not be easy to fine-tune this later. We have always had some AMD(/ATI)- and Nvidia-biased games, but the number of AMD-biased games has increased because both the PS4 and the Xbox One are AMD-based.


Please explain precisely: what will make GCN suddenly grow past its design faults?

This kind of reminds me of the good old Bulldozer days, when all the fans were screaming that new software would make AMD overcome all the issues. :p Of course it never happened, and AMD has finally discarded the inefficient architecture in favor of an architecture more similar to the competition's.
All your replies shall serve as the counter-arguments to other posts arguing against mine. My post is a best-case scenario for AMD using 'populist' beliefs about APIs and hardware. Thank you for laying them bare!
Posted on Reply
#129
Captain_Tom
efikkanWhat you are describing is totally impossible. The new APIs will not and can not counter the inefficiencies in the GCN architecture, and will not result in a 50% relative gain for AMD vs Nvidia. The architectural inefficiencies in GCN are not software, it's hardware design.

The only path forward is architectural overhaul. Volta is going to be a bigger architectural change than Pascal, while AMD has stuck to their GCN since the Kepler days of Nvidia.
What 50% gain? At launch (And Stock settings) the Fury X was trading blows with the 980 Ti/Titan X. The 1080 is only 25% stronger than those cards, so the Fury X would only need to gain 25% relative performance, which isn't a big number at all.

Even that OG Titan vs 7970 example isn't as big as you are describing: The OG Titan was only 35-40% stronger than the 7970 lol.
Posted on Reply
#130
Captain_Tom
mcraygsxSeems to be a good 30% boost from the 1080, but I wish NVidia would stick to GDDRx; the GTX 1080 still has an outstanding 320 GB/s.
The only thing I want to point out is that GDDR5 overclocks WAY, WAY better than GDDR5X. The highest I have EVER seen GDDR5X get to is 11,000 MHz effective, whereas plenty of GDDR5 chips can hit 9,600. That only puts GDDR5X 15% faster, while it costs a decent amount more.


1080 Ti is a mass market chip so I actually think it is a very good decision considering they will sell just as many to the lemmings either way.
Posted on Reply
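As a quick check on the "15% faster" figure above, and on what such an overclock would imply on the 1080 Ti's 384-bit bus (the 11,000 and 9,600 MHz effective rates are the poster's anecdotal numbers, not official specs):

```python
# The poster's anecdotal overclock ceilings, in MHz effective (Mbps per pin).
gddr5x_oc = 11_000
gddr5_oc = 9_600

print((gddr5x_oc / gddr5_oc - 1) * 100)   # ~14.6%, i.e. roughly the "15% faster" claimed

# What a 9.6 Gbps GDDR5 overclock would mean on the 1080 Ti's 384-bit bus:
print(384 / 8 * 9.6)                      # 460.8 GB/s, up from the stock 384 GB/s
```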
#131
64K
Captain_Tom1080 Ti is a mass market chip so I actually think it is a very good decision considering they will sell just as many to the lemmings either way.
Very few people buy high end GPUs for gaming. It's not a big income generator for Nvidia or AMD.

Probably that greedy Nvidia wants to make a profit so that they can stay in business, unlike the saints at AMD, who only want to sell everything too cheap to make a decent profit and go bankrupt. :)
Posted on Reply
#132
efikkan
Captain_TomWhat 50% gain? At launch (And Stock settings) the Fury X was trading blows with the 980 Ti/Titan X. The 1080 is only 25% stronger than those cards, so the Fury X would only need to gain 25% relative performance, which isn't a big number at all.

Even that OG Titan vs 7970 example isn't as big as you are describing: The OG Titan was only 35-40% stronger than the 7970 lol.
Have you forgotten your own claims from yesterday:
Captain_TomWhen it comes to actual final performance numbers (once the dust settles), I think the best indicators you can look at are a combination of TFLOPS and bandwidth.
-Fury OC / Fury X will = 1080
-480 will be like 10% behind the 1070
...
They would need a 40-50% gain to achieve this, and it will never happen.

Even if you moderate yourself to "only" 25% now, how will Fury X be able to achieve that?
Posted on Reply
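Worth noting when reading the percentages in this exchange: "A is X% faster than B" and "B is Y% behind A" describe the same gap with different-looking numbers, so it matters which direction a figure is quoted in. A small sketch with purely illustrative scores (not benchmark data):

```python
# "A is X% faster than B" vs. "B is Y% behind A": same gap, different numbers.
# Scores below are illustrative only, not measured results.

def gap(score_a, score_b):
    """For score_a > score_b, return (% A leads by, % B trails by)."""
    lead = (score_a / score_b - 1) * 100    # also the relative gain B needs to catch A
    trail = (1 - score_b / score_a) * 100   # B's deficit expressed against A
    return round(lead, 1), round(trail, 1)

print(gap(125, 100))   # (25.0, 20.0)  -> a 25% lead reads as a 20% deficit
print(gap(150, 100))   # (50.0, 33.3)  -> a 50% lead reads as a 33.3% deficit
```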
#133
Captain_Tom
64KVery few people buy high end GPUs for gaming. It's not a big income generator for Nvidia or AMD.

Probably that greedy Nvidia wants to make a profit so that they can stay in business, unlike the saints at AMD, who only want to sell everything too cheap to make a decent profit and go bankrupt. :)
Considering the price gouging AMD pulled in the old FX days, I wouldn't call them saints, buddy.
Posted on Reply
#134
RJ
I expect 1080ti to launch somewhere between $649-799, perform within 5% of the Pascal Titan X but have less VRAM, pretty much the same deal as with Titan X and 980ti.

It's not quite the card to master 4K/60 although it will get very close. The next x80 card NV launches will get it done but by then 4K/60Hz+ will be a thing.
Posted on Reply
#135
Captain_Tom
RJI expect 1080ti to launch somewhere between $649-799, perform within 5% of the Pascal Titan X but have less VRAM, pretty much the same deal as with Titan X and 980ti.

It's not quite the card to master 4K/60 although it will get very close. The next x80 card NV launches will get it done but by then 4K/60Hz+ will be a thing.
Idk this time the specs are different enough that I think this will be 10-20% weaker at stock. However like I previously said: GDDR5 overclocks better than GDDR5X, and the better coolers will allow slightly better core clocks. Overall I would expect a 7-10% difference when both are overclocked (Whereas before they were nearly equal).

This card will be 15-20% stronger than the 1080 and cost $750 with $850 for the Founders Edition. Thus $800 price in reality.
Posted on Reply
#136
ppn
RJI expect 1080ti to launch somewhere between $649-799, perform within 5% of the Pascal Titan X but have less VRAM, pretty much the same deal as with Titan X and 980ti.
Less RAM. Could it be halved, 6 GB vs. 12, just like 980 Ti/Titan X? No. If they had released the Titan Pascal with 24 GB, yes. But they didn't.

Remember how the GTX 770 was released in May 2013 and the GTX 970 in September 2014, exactly 60% faster? Much like the 1080 Ti is to be exactly 60% faster than the GTX 1070.

The 1080 Ti can't be more expensive than the SLI setup it replaces. The 1070 will probably drop to GTX 1160 level, just like the GTX 670 did once the GTX 760 came very close.

The lesson: you can't make GTX 970 SLI work for longevity; it was replaced by the 980 Ti soon after, and the reference 980 Ti was replaced by a GTX 1060 @ 2.2 GHz.

You can't make the GTX 1070 work either; it will be replaced by the 1080 Ti, and the reference 1080 Ti will be replaced by...
Posted on Reply
#137
dalekdukesboy
the54thvoidComparing cards using the Deus Ex MD benchmark isn't demonstrative of actual game performance. There are other Deus Ex MD reviews that show Fury X behind 1080. Cherry picking Guru 3D, who AMD fans often slag off for some reason doesn't illustrate anything.

Also, using Doom Vulkan is an excellent gauge for the future. Made with explicit AMD extensions (because Nv don't have them), it shows about the best-case scenario, IMO, for AMD's future performance. So, given that it's hard to see how much farther GCN can go (and Navi won't have it) and Titan XP (unrealistic card but shows Nvidia's fastest) is far ahead even in Vulkan, it's very hard to see Captain Tom's future.

Then there is the elephant in the room which few have had the reasoning to spot. The AMD resurgence is clearly based on the move from DX11 to DX12 and one game using Vulkan (again with explicit AMD extensions). Using this new paradigm, we can expect no similar performance improvements from AMD over Pascal in these API's.
The situation of graphics cards will remain as it has with DX11. A game developed with assistance from AMD or Nv will favour that card. Hitman and Deus Ex both favour AMD. Both were developed in the Nvidia classic style of, 'lets hamper the competition'. Just like TWIMTBP games tend to highlight Nv abilities at the expense of AMD.
Dx12 etc will help AMD achieve greater parity but given the Titan XP with fewer shaders than Fury X still soundly beats it in everything (faster clocks but like peeps say, no Async or DX12 magic) then you have to wonder how bad it might be when Nvidia bring back a little parallel async compute based hardware...

And yes. I can compare Pascal to Fiji because all a die shrink does is (simplistically) reduce power use and increase the ability to throw on more hardware. Nv used the shrink to keep the die reasonably clean but bring up clocks.

Anyway, it'll be fun when Vega arrives because with Fury X level of cores on 14nm, it should be clocked far higher. That alone with some GCN tweaks should overtake the 1080. But then Nvidia will react with 'something'. 2017 is worth talking about because Vega will give us some solid numbers to discuss but this will ring true - if in 2017, a Titan XP beats Vega in an AMD Vulkan game, AMD are in trouble. If on the other hand Vega beats Titan, AMD will rightly be confident of a rosy future.

Until Vega is out, all of these awful conversations (including mine) are about as insightful as a cat farting. The proof of science is in the testing and we can't test that future till it's here.
I like the post overall; it's mostly true and self-deprecating, considering you lump your own lengthy post in with the rest as being about as useful as a cat fart. However, I disagree with the idea that people (like me) missed any elephant trouncing around: we mentioned Vulkan heavily, and I believe DX12 was mentioned too (and if not, it intrinsically goes hand in hand with DX12), and that pair is what Major Tom from outer space is very faultily basing a cat-fart-type argument on.
Posted on Reply
#138
RJ
Captain_TomIdk this time the specs are different enough that I think this will be 10-20% weaker at stock. However like I previously said: GDDR5 overclocks better than GDDR5X, and the better coolers will allow slightly better core clocks. Overall I would expect a 7-10% difference when both are overclocked (Whereas before they were nearly equal).

This card will be 15-20% stronger than the 1080 and cost $750 with $850 for the Founders Edition. Thus $800 price in reality.
I forgot that the 980 Ti launched at $699; I thought it was $649. I agree it's likely to launch at $749, and the Founders Edition will again be the first in stock, at a premium.
The rise in pricing as Nvidia captured market share has led me to approach upgrades differently:
I just bought my second 980 Ti for $300. An SLI setup can be had for $600, 50% of the Pascal Titan's price, for 25%-33% more performance than a $1200 Pascal Titan.

Reference for the numbers is here: www.guru3d.com/articles-pages/nvidia-geforce-titan-x-pascal-review,26.html and in my system specs.
The Pascal Titan gets a 9K graphics score; my 980 Ti SLI does 12K: www.3dmark.com/3dm/14930947
Posted on Reply
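Taking the figures quoted in the post above at face value (a single 3DMark graphics score per setup, and setting aside games where SLI doesn't scale), the value comparison works out roughly like this:

```python
# Price/performance comparison using only the figures quoted in the post above.
# Single 3DMark graphics scores; real-world SLI scaling varies per game.
titan_x_pascal = {"price": 1200, "score": 9_000}
sli_980_ti = {"price": 2 * 300, "score": 12_000}

extra_perf = (sli_980_ti["score"] / titan_x_pascal["score"] - 1) * 100   # ~33% more
price_fraction = sli_980_ti["price"] / titan_x_pascal["price"] * 100     # 50% of the price

print(f"{extra_perf:.0f}% more performance at {price_fraction:.0f}% of the price")
print({name: cfg["score"] / cfg["price"] for name, cfg in
       [("Titan X Pascal", titan_x_pascal), ("980 Ti SLI", sli_980_ti)]})
# -> 33% more performance at 50% of the price
# -> {'Titan X Pascal': 7.5, '980 Ti SLI': 20.0}  (3DMark points per dollar)
```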
#139
Captain_Tom
dalekdukesboyI like the post overall; it's mostly true and self-deprecating, considering you lump your own lengthy post in with the rest as being about as useful as a cat fart. However, I disagree with the idea that people (like me) missed any elephant trouncing around: we mentioned Vulkan heavily, and I believe DX12 was mentioned too (and if not, it intrinsically goes hand in hand with DX12), and that pair is what Major Tom from outer space is very faultily basing a cat-fart-type argument on.
What is this "Captain_Tom's Future" you guys are talking about? I am saying the Fury X will roughly match the 1080 within a year - again if I am wrong you can remind me later. I never expect the Fury X to match the Titan XP, but if it did I wouldn't be completely surprised.
Posted on Reply
#140
dalekdukesboy
I just got my first 980 Ti for the same sum, exactly 300 dollars. They're pretty hard to find at that price, but if you're lucky you can find one and get a good deal right now.
Posted on Reply
#141
Captain_Tom
dalekdukesboyI just got my first 980 Ti for the same sum, exactly 300 dollars. They're pretty hard to find at that price, but if you're lucky you can find one and get a good deal right now.
Just curious, where are you finding these deals? My friend is building right now...
Posted on Reply
#142
dalekdukesboy
I just lucked out on eBay. No idea where the other bloke got his from, obviously, but he said the same amount. I just checked eBay: there are a few slightly under $300, most are way over, and all are on auction. Mine was a Buy It Now at that price, so I jumped on it; I needed a new card because my 980 had died.
Posted on Reply
#143
Captain_Tom
dalekdukesboyI just lucked out on eBay. No idea where the other bloke got his from, obviously, but he said the same amount. I just checked eBay: there are a few slightly under $300, most are way over, and all are on auction. Mine was a Buy It Now at that price, so I jumped on it; I needed a new card because my 980 had died.
Brand of 980?



And yeah, I got lucky a couple of years ago when I was crypto mining. Found a bunch of 7950s for $100 each lol (this was in 2014; even today that would be an insane deal). They all overclocked to 1150/1800!
Posted on Reply
#144
dalekdukesboy
My croaked card? It was an MSI Gaming 980: dead silent, great cooler, clocked well, just a good card in every way, same as in the review Wiz did of it. Yeah, I had a 7970 that I used for quite a while, then sold it to upgrade to the 980 about a year ago. The 980 would still be in the PC, but a stray metal fan clip touched the back of it (no backplate), and yeah, that fried the card... first card I ever bricked.
Posted on Reply
#145
Captain_Tom
dalekdukesboyMy croaked card? It was an MSI Gaming 980: dead silent, great cooler, clocked well, just a good card in every way, same as in the review Wiz did of it. Yeah, I had a 7970 that I used for quite a while, then sold it to upgrade to the 980 about a year ago. The 980 would still be in the PC, but a stray metal fan clip touched the back of it (no backplate), and yeah, that fried the card... first card I ever bricked.
Shit, that's some bad luck - sorry. Personally, I wish there was a dominator like the 7970 @ 1220/1830 I had. At those clocks in 2012 I was laughing at the framerates I was getting...

Only paid $390 for it too.
Posted on Reply
#146
RJ
I bought my 980 Ti on eBay too because I kept missing the sales on forums; it happened to be the same model as the other card I bought earlier on a tech forum.

Speaking of old cards, my 970 was nothing special, but the previous card is unbeatable in terms of value over time: a 7950 MSI TF3. I got it on Amazon for $309, as soon as they launched, and it paid for itself via Bitcoin and then some. I eventually upgraded to the 970, put a system together from old parts with the 7950, and gave it to a friend; it's still going strong. It had a great ASIC score too, needed little voltage, and even on air it overclocked to 1250/1850 (a stock card comes in at 880 on the core). This is a link to the tests I ran: forums.anandtech.com/threads/radeon-hd-7950-owners-thread.2259333/page-10#post-33822143
Posted on Reply
#147
Vayra86
Captain_TomHahaha good joke. I love that the 680 is vastly weaker than the 7970, and yet people bought it because it is "The Way It's Meant to be Played".
Eh... what?

www.babeltechreviews.com/hd-7970-vs-gtx-680-2013-revisited/3/

They're about equal, with the 7970 only winning in the heavily AMD-favored DiRT and at 4K, where both cards produce unplayable frame rates. Meanwhile, in Dying Light the 680 scores a good 15 fps more. Overall the 680 can definitely be considered the better choice, as it OCs better and, at the time, 3 GB was overkill for most games and resolutions.

At launch, the 680 was overall 10% faster than the 7970. They both aged well, to be honest. The reason you wouldn't buy the 680 was a different one: price. The much cheaper 670 could do almost as well as the 680.
Posted on Reply
#148
RJ
My 7950 was faster than the stock blower 7970s because the cooling and the binning were better; its OC gave it the headroom to go beyond what some 7970s could achieve on stock or mediocre cooling. It was faster than a 680 @ 1.2 GHz, and not many 680s went beyond 1.2 GHz, while selling at a mid-range price of $309. I estimate I made $450 mining; had I been more patient, it could have been a few thousand.

AC Unity was the first game for me that demanded heavy compromises for the 7950 and other AMD cards to run at playable frame rates; later patches improved it a bit. The 970 after that wasn't as impressive: the price was in a similar range, but the value quickly evaporated.
Posted on Reply
#149
Captain_Tom
Vayra86Eh... what?

www.babeltechreviews.com/hd-7970-vs-gtx-680-2013-revisited/3/

They're about equal, with the 7970 only winning in the heavily AMD-favored DiRT and at 4K, where both cards produce unplayable frame rates. Meanwhile, in Dying Light the 680 scores a good 15 fps more. Overall the 680 can definitely be considered the better choice, as it OCs better and, at the time, 3 GB was overkill for most games and resolutions.

At launch, the 680 was overall 10% faster than the 7970. They both aged well, to be honest. The reason you wouldn't buy the 680 was a different one: price. The much cheaper 670 could do almost as well as the 680.
I genuinely think the 670 was a good card for most of its life, but again I just can't get behind almost any argument for the 680 (Besides it being cheaper and a bit stronger at launch).

But Idk what you are talking about with regard to overclocking. A 7970 at 1250/1850 is a monstrous 35% overclock; it was so high that my card was trading blows with a 970 when the 970 came out (now it would roflstomp it in Vulkan/DX12 games).

I also checked a benchmark for Dying Light: The Following: the 680, 7970, 780, and 7970 GHz all get about the same frame rate. But let's not go there, because I can find A LOT of benchmarks from the past 2 years that show a 7970 destroying a 680 (in fact the 7870 roughly matches the 680).
Posted on Reply
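For context on the overclock figures above: assuming the reference (non-GHz Edition) HD 7970 clocks of 925 MHz core and 1375 MHz memory, the quoted 1250/1850 does indeed work out to roughly a 35% overclock:

```python
# Overclock headroom vs. reference HD 7970 clocks (925 MHz core, 1375 MHz memory).
# The 1250/1850 figures are the poster's own card; results are approximate.
core_oc, mem_oc = 1250, 1850
core_ref, mem_ref = 925, 1375

print(f"core: +{(core_oc / core_ref - 1) * 100:.1f}%")   # +35.1%
print(f"mem:  +{(mem_oc / mem_ref - 1) * 100:.1f}%")     # +34.5%
```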
#150
HammerON
The Watchful Moderator
HammerONI think we are getting off topic here a bit. Please keep on topic.
I posted the above message on page 3 of this thread. It appears that we are having difficulties staying on topic. Warnings will be given out if this continues.
Posted on Reply