Tuesday, February 28th 2017

NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

NVIDIA today unveiled the GeForce GTX 1080 Ti graphics card, its fastest consumer graphics card based on the "Pascal" GPU architecture, positioned to be more affordable than the flagship TITAN X Pascal at USD $699, with market availability from the first week of March 2017. Based on the same "GP102" silicon as the TITAN X Pascal, the GTX 1080 Ti is slightly cut down. While it features the same 3,584 CUDA cores as the TITAN X Pascal, the memory amount is lower, at 11 GB, over a slightly narrower 352-bit wide GDDR5X memory interface. This translates to 11 memory chips on the card. On the bright side, NVIDIA is using newer memory chips than the ones it deployed on the TITAN X Pascal, which run at 11 Gbps (GDDR5X-effective), putting memory bandwidth at 484 GB/s.
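That 484 GB/s figure follows directly from the bus width and data rate. A quick sanity check (Python; the variable names are ours, the numbers come from the announcement):

```python
# Back-of-the-envelope check of the quoted memory bandwidth.
CHIPS = 11                    # one GDDR5X chip per 32-bit channel
BUS_WIDTH_BITS = CHIPS * 32   # = 352-bit memory interface
DATA_RATE_GBPS = 11           # 11 Gbps effective per pin

bandwidth_gb_s = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8  # bits -> bytes
print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 484 GB/s
```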

Besides the narrower 352-bit memory bus, the ROP count is lowered to 88 (from 96 on the TITAN X Pascal), while the TMU count is unchanged at 224. The GPU core is clocked at a boost frequency of up to 1.60 GHz, with the ability to overclock beyond the 2.00 GHz mark. It gets better: the GTX 1080 Ti features memory advancements not found on other "Pascal" based graphics cards, namely the newer memory chips and an optimized memory interface running at 11 Gbps. NVIDIA has also finally announced its Tiled Rendering technology publicly; a feature NVIDIA has kept quiet since the GeForce "Maxwell" architecture, it is one of the secret sauces that enable NVIDIA's lead.
The Tiled Rendering technology brings about huge improvements in memory bandwidth utilization by optimizing the render process to work in square screen-space tiles, instead of drawing whole polygons across the frame. Thus, the geometry and textures of a processed object stay on-chip (in the L2 cache), which reduces cache misses and memory bandwidth requirements.
Together with its lossless memory compression tech, NVIDIA expects Tiled Rendering and its caching scheme, Tiled Caching, to more than double, or even nearly triple, the effective memory bandwidth of the GTX 1080 Ti over its physical bandwidth of 484 GB/s.
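To illustrate the general idea (NVIDIA has not disclosed its actual implementation, so the tile size and data structures below are hypothetical): a tiled renderer first bins triangles by the screen tiles they touch, then processes one tile at a time, so each tile's working set stays in cache. A minimal sketch:

```python
# Illustrative tile binning -- a sketch of the technique in general,
# NOT NVIDIA's undisclosed implementation. Triangles are binned to the
# screen tiles they overlap; each tile is then processed in full, so
# its framebuffer region can stay in on-chip cache instead of
# round-tripping to GDDR5X for every overlapping polygon.

TILE = 64  # hypothetical tile size, in pixels

def bin_triangles(triangles):
    """Map each tile coordinate (tx, ty) to the triangles overlapping it."""
    bins = {}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        # Conservative test: every tile touched by the bounding box.
        for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
            for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

def render(triangles):
    for (tx, ty), tris in sorted(bin_triangles(triangles).items()):
        # Everything drawn here targets one small screen region,
        # which is what lets the hardware keep it resident in L2.
        print(f"tile ({tx},{ty}): shading {len(tris)} triangle(s)")

# Two triangles: one within a single tile, one spanning several.
render([[(10, 10), (50, 10), (30, 40)],
        [(0, 0), (200, 0), (0, 200)]])
```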
NVIDIA is making sure it doesn't run into the thermal and electrical issues of previous-generation reference-design high-end graphics cards by deploying a new 7-phase dual-FET VRM that reduces the load (and thereby temperature) per MOSFET. The underlying cooling solution is also improved, with a new vapor-chamber plate and a denser aluminium channel matrix.
Watt-for-Watt, the GTX 1080 Ti will hence be up to 2.5 dBA quieter than the GTX 1080, or up to 5°C cooler. The card draws power from a combination of 8-pin and 6-pin PCIe power connectors, with the GPU's TDP rated at 220 W. The GeForce GTX 1080 Ti is designed to be anywhere between 20% and 45% faster than the GTX 1080 (35% on average).
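The per-MOSFET arithmetic behind that VRM change is straightforward. A rough sketch, assuming ideal load sharing and a hypothetical 5-phase single-FET design as the point of comparison (only the 220 W figure and the 7-phase dual-FET layout come from the announcement):

```python
# How phase count and doubled FETs spread the electrical load.
# Assumes ideal, even load sharing across all devices -- real VRMs
# deviate from this, so treat the numbers as illustrative only.
TDP_W = 220  # rated board power, per the article

def watts_per_fet(phases, fets_per_phase, load_w=TDP_W):
    return load_w / (phases * fets_per_phase)

print(watts_per_fet(5, 1))  # hypothetical older design: 44.0 W per FET
print(watts_per_fet(7, 2))  # GTX 1080 Ti reference VRM: ~15.7 W per FET
```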
The GeForce GTX 1080 Ti is widely expected to be faster than the TITAN X Pascal out of the box, despite its narrower memory bus and fewer ROPs; the higher boost clocks and 11 Gbps memory make up for the deficit. What's more, the GTX 1080 Ti will be available in custom-design boards at factory-overclocked speeds, so it will end up being the fastest consumer graphics option until there's competition.

160 Comments on NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

#26
evernessince
SpartanM07: $1278 actually with tax, and I'm very happy... Sell the Titan X for more than or equal to what I paid on eBay (check out current selling prices, although it will most likely sell for less meow), buy the new 1080 Ti, pocket the extra, and all that after I've maxed every game for the past 7 months.
No enthusiast who is looking at a Titan X is going to pay anywhere near what you paid for it now. These are people who follow the news because they want the latest and greatest. Why would they buy a Titan X when they can wait and get a higher-performing aftermarket 1080 Ti?

If you look at sold eBay listings, a few Titan X video cards sold for $1k today, which is a pretty big drop. And that's assuming none of those buyers return the cards for a refund. If you wanted to sell your Titan X, you should have sold it yesterday to get full value back.
#27
theGryphon
MrGenius: So... I'm reading the graph backwards? OK. Then let me draw some straight lines and see where I went wrong.

32.5 dBA @ 88°C
35.5 dBA @ 82°C

Well looky there. The hotter it runs, the quieter the cooler is. So the fan slows down and makes less noise as the card heats up, and the fan speeds up and makes more noise as the card cools down. Yep... makes perfect sense!
You're getting the causal direction wrong, which is baffling considering the overall high sophistication in this forum.

In terms of correlation, yes, temperature and dBA are (and have always been, since the Big Bang established the laws of physics and thermodynamics) negatively correlated.

The causal relation here, though, is NOT that the fan slows down as the heat goes up; it IS that as you slow down the fan, the heat goes up.

That's why you were told you're reading the graph backwards...

EDIT: The more direct way to read the graph is that as you increase the fan speed (and hence the dBA), the card runs cooler.
#28
theGryphon
evernessince: I think someone at Nvidia messed up...

The X axis is noise level, increasing from left to right.
The Y axis is temperature, increasing from bottom to top.

For some reason they have the card starting at around 88°C with only 32.5 dBA, and the noise increasing as temps go up.
No, they did not mess anything up.
Along with my post above, recall that the X axis is typically the control parameter (here, the fan speed) and the Y axis is the response parameter (here, the temperature realized in response to the fan speed the user decides on).

So, you guys need to learn how to read a graph and reconcile it with the laws of physics and thermodynamics, which I believe everyone here at least has an intuition of ;)
#29
kruk
No Founders Edition tax? Only $699? 11 GB of RAM? Does anybody else feel AMD tricked them into releasing this card so early with their Vega "reveal"? They could have kept charging that much for the 1080 and made more profit. Also, people who just bought the 1080 FE or custom models must be really mad :D
#30
RejZoR
Camm: Somewhat inflammatory. As much as I wish AMD would just get Vega the fuck out of the door, it's still readily apparent that as Pascal clocks higher, its efficiency per MHz drops, with what looks like cache idling. So a 50% claim is highly overrated.

Oh well, where are the reviews?
The thing is, if you compare the RX 480 and GTX 1060, the latter needs what, an extra 400 MHz to match a 1400 MHz RX 480? Meaning we're dealing with two quite different architectures. NVIDIA's isn't as efficient in terms of raw power per clock and needs to compensate with really high GPU clocks. It has actually been like this for a while. Even with the GTX 900 series: the R9 Fury X was what, 1050 MHz? My GTX 980 runs at 1400 MHz and it about matches the R9 Fury X in performance. Sometimes. You can't just say uh oh, RX Vega will suck because AMD can't clock it that high. That's kinda irrelevant.

@theGryphon
What NVIDIA is basically saying is the following...

With the GTX 1080, you had to run the cooler's fan at speeds that create 32.5 dBA of noise to achieve 94°C, and 35.5 dBA to achieve 87°C.

With the GTX 1080 Ti, they've achieved the same noise levels, but at lower temperatures of 88°C and, what's that, 82°C?

So, technically, they won't make it quieter out of the box, but it can be quieter because they created this temperature gap. You could theoretically lower the fan speed to achieve the old temperatures and gain quieter operation.
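Putting rough numbers on that trade-off: interpolating between the quoted points (the real fan curves are not necessarily linear, so this is only a sketch) shows where the article's "up to 2.5 dBA quieter" figure comes from.

```python
# Estimate the noise needed to hold a target temperature, given two
# (dBA, degC) points from a card's fan curve. Linear interpolation is
# an assumption -- the real curves are not necessarily straight lines.
def noise_for_temp(p1, p2, target_temp):
    (n1, t1), (n2, t2) = p1, p2
    return n1 + (n2 - n1) * (target_temp - t1) / (t2 - t1)

gtx_1080_ti = [(32.5, 88), (35.5, 82)]  # points quoted above

# The GTX 1080 needs 35.5 dBA to hold 87 degC; the Ti would need:
dba = noise_for_temp(*gtx_1080_ti, 87)
print(f"{dba:.1f} dBA ({35.5 - dba:.1f} dBA quieter)")  # -> 33.0 dBA (2.5 quieter)
```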
#31
evernessince
theGryphon: No, they did not mess anything up.
Along with my post above, recall that the X axis is typically the control parameter (here, the fan speed) and the Y axis is the response parameter (here, the temperature realized in response to the fan speed the user decides on).

So, you guys need to learn how to read a graph and reconcile it with the laws of physics and thermodynamics, which I believe everyone here at least has an intuition of ;)
No, the X axis is clearly labeled dBA. If they were going to include 3 variables, they should have just done a 3D graph. How you assume people would draw the conclusion that there is a 3rd, unmentioned variable without prior context is the baffling part. This graph was obviously never meant to be read without the fan context you have provided.
#32
EarthDog
OneCool: 11 GB of VRAM screams something isn't right... W1z... Come on, back my old ass up here... Core math doesn't hold up on this lol... 2+2 isn't almost 4
That's why there is the odd bus width and back end. ;)
kruk: No Founders Edition tax? Only $699? 11 GB of RAM? Does anybody else feel AMD tricked them into releasing this card so early with their Vega "reveal"? They could have kept charging that much for the 1080 and made more profit. Also, people who just bought the 1080 FE or custom models must be really mad :D
Perhaps they want to get it out before Vega, for sales without competition. I have to imagine Vega will be as fast as a Titan XP/1080 Ti.
#33
qubit
Overclocked quantum bit
A slightly crippled GPU and a weird 11 GB of RAM on their top GTX? Now that's just fugly. :shadedshu: I'll wait for the reviews and Vega before buying, but this puts me off the card and I might just stick to a 1080. The thing was plenty fast anyway.

It wouldn't surprise me if NVIDIA released something like a 2080 Ti with the full GPU, 12 GB of RAM and higher clocks when Vega comes out, for significantly better performance. They might then be able to hike the price, too...
#34
Aenra
kruk: Also, people who just bought the 1080 FE or custom models must be really mad :D
Naah...
For starters, we've known there'd be a Ti version since, like, forever. Also, this was originally meant to happen in January or thereabouts, if you recall. So the people buying a 1080 in the last few months (myself included) knew everything they needed to know. Either they could not wait, or they had decided in advance that the money required was outside their budget/sense of reason.
(Don't pay attention to the price tags mentioned here... don't even pay attention to the price tags + VAT. You need to add customs [a world exists outside the US], and you need to add the extra bucks charged for the 'improved' models that will be coming out*. For millions of people, this card translates to about 1.5k.)

* In case you're too far gone into the techie side of the force: most folks don't give a rat's behind about Founders Editions. They will go buy the EVGA/Gigabyte/Asus model that will be 20-25% faster, and signatures be damned :)
#35
Prima.Vera
To be honest, even 8 GB of VRAM would have sufficed. I think they used 11 chips not because of game requirements (that would be overkill), but in order to widen the bus to 11x32 = 352 bits, instead of the 8x32 = 256 bits on the 1080.
#36
Patriot
ShurikN: "While it features the same 3,584 CUDA cores as the TITAN X Pascal, the memory amount is lower, at 11 GB, over a slightly narrower 352-bit wide GDDR5X memory interface. This translates to 11 memory chips on the card."
Nice hack job
Well... I guess they didn't want to get sued again... They still haven't paid out the settlement they agreed to.
#37
theGryphon
evernessince: No, the X axis is clearly labeled dBA. If they were going to include 3 variables, they should have just done a 3D graph. How you assume people would draw the conclusion that there is a 3rd, unmentioned variable without prior context is the baffling part. This graph was obviously never meant to be read without the fan context you have provided.
Yes, dBA, which is very clearly a proxy parameter for fan speed. I mean, given the context, it's just too obvious, don't you think? So, why would they need a 3D graph? They assumed, as they should, that whoever reads this graph would understand that dBA stands for the sound levels coming out of the card...

I mean, seriously, I won't waste my time on this anymore.
#38
ZoneDymo
fynxer: VEGA is fucked!!! A 1080 Ti with OC is over 50% faster than a stock 1080; VEGA is blown out of the water before it even hits the market. All this at $699. Just saying, good luck AMD.

AMD waited too long with VEGA and will now pay the ultimate price.
Man, I've rarely seen so much fanboy in one post...
#39
evernessince
theGryphon: Yes, dBA, which is very clearly a proxy parameter for fan speed. I mean, given the context, it's just too obvious, don't you think? So, why would they need a 3D graph? They assumed, as they should, that whoever reads this graph would understand that dBA stands for the sound levels coming out of the card...

I mean, seriously, I won't waste my time on this anymore.
I'm sorry, I just didn't make the assumption that the RPMs are exactly the same between the two cards. If you read the graph with the assumption that all fan variables are the same, then it works fine.
#40
chr0nos
Maybe another GTX 970 memory fiasco :wtf:
#41
johnspack
Here For Good!
Wow, only 1000 CAD? I'll take 2!
#42
RejZoR
evernessince: I'm sorry, I just didn't make the assumption that the RPMs are exactly the same between the two cards. If you read the graph with the assumption that all fan variables are the same, then it works fine.
If you're hitting the same noise levels, chances are the fan speed is identical, don't you think? In terms of noise, this just means you can lower the fan speed, achieving the old temperatures but at lower noise. You can't have both at once unless you pick a temperature and fan noise halfway along both axes... In that case, you'd make it a tiny bit quieter and a tiny bit cooler than the old GTX 1080.
#43
petedread
I would have been excited about this card had I not bought a 980 Ti Classified on release. That card has put me off Nvidia. I've had to lower game settings 3 or 4 times now since I bought it. Fallout 4 was fantastic at 4K to start with; admittedly I could not play at max settings even then, but now I have so many settings turned down or off. I know some people are talking about Nvidia hampering performance on cards, but I am not interested in anybody else's experience; my own experience with Fallout 4 and Dying Light has shown me. I will buy the top Vega card regardless of how it performs compared to Nvidia's offerings. Instead of performance that goes down, I will have a card with performance that goes up over time lol.
#44
ratirt
Strange this card may be (yeah, I watched Star Wars yesterday :P). I'm waiting for benchmarks, but how they designed the card is weird indeed. Somebody mentioned that NV is releasing the Ti version now because they want to get some shiny pennies out of it? I think that's quite right. I think NV was in a real hurry to release it. Let's wait and see what this card can do :)
#45
the54thvoid
Intoxicated Moderator
petedread: I would have been excited about this card had I not bought a 980 Ti Classified on release. That card has put me off Nvidia. I've had to lower game settings 3 or 4 times now since I bought it. Fallout 4 was fantastic at 4K to start with; admittedly I could not play at max settings even then, but now I have so many settings turned down or off. I know some people are talking about Nvidia hampering performance on cards, but I am not interested in anybody else's experience; my own experience with Fallout 4 and Dying Light has shown me. I will buy the top Vega card regardless of how it performs compared to Nvidia's offerings. Instead of performance that goes down, I will have a card with performance that goes up over time lol.
Did you break your PC? My 980 Ti hasn't crippled itself downwards...
#46
petedread
fynxer: VEGA is fucked!!! A 1080 Ti with OC is over 50% faster than a stock 1080; VEGA is blown out of the water before it even hits the market. All this at $699. Just saying, good luck AMD.

AMD waited too long with VEGA and will now pay the ultimate price.
It does not matter how Vega performs compared to Nvidia cards. I just want a card that does what I want it to do. I cannot wait to replace my 980 Ti.
#47
R0H1T
chr0nos: Maybe another GTX 970 memory fiasco :wtf:
Watch this space for more :D
johnspack: Wow, only 1000 CAD? I'll take 2!
That'll be 2k plus taxes, if any :shadedshu:
#49
petedread
the54thvoid: Did you break your PC? My 980 Ti hasn't crippled itself downwards...
LoL, like I said, my personal experience has put me off.
#50
MrGenius
Oh, for crying out loud. How hard can this be to understand, people? They made a mistake. Plain and simple.

Here. I fix.

How difficult was that?