
NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

Seriously guys, please learn to read a graph properly.

As said before, the X-axis is generally the control parameter (the source, the input) and the Y-axis is the response parameter (the result, the output). Changing the X parameter produces a change in the Y parameter.

In this case the graph depicts COOLER PERFORMANCE when dissipating 220 watts of heat. Here the performance of a cooler is characterized by how low the resulting temperature is and how quiet the cooling system is.

For a given cooler design (there are two in the graph, the 1080 cooler and the 1080 Ti cooler) and a constant heat load, the resultant junction temperature (Y-axis) is correlated with the noise generated by the cooling system (here it is assumed that noise is correlated with fan speed).
- By varying the noise (fan speed), the resultant temperature varies too.
- Take a sample at low noise (low fan speed): at 32.5 dB, the 1080 cooler results in ~95 C, the 1080 Ti cooler in ~88 C.
- Take a sample at high noise (high fan speed): at 35.5 dB, the 1080 cooler results in ~87 C, the 1080 Ti cooler in ~82 C.

- Notice that as fan speed goes up (noise increases), the temperature goes down. Intuitive, isn't it?
- Notice that at the same noise level, the 1080 Ti cooler results in a lower temperature than the 1080 cooler. That is the point the graph is trying to get across.
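To make that concrete, here's a quick Python sketch of the same reading. The four data points are just the approximate values quoted above, read off NVIDIA's graph, and treating each cooler's curve as a straight line between them is an assumption for illustration:

```python
# Interpolate each cooler's (noise, temperature) line at a constant
# 220 W heat load and compare the two coolers at equal noise levels.

def temp_at_noise(noise_db, p1, p2):
    """Linearly interpolate temperature (C) at a given noise level (dB)
    from two (noise_dB, temp_C) samples."""
    (n1, t1), (n2, t2) = p1, p2
    return t1 + (t2 - t1) * (noise_db - n1) / (n2 - n1)

# Approximate samples read off the graph: (noise dB, temp C)
cooler_1080    = ((32.5, 95.0), (35.5, 87.0))
cooler_1080_ti = ((32.5, 88.0), (35.5, 82.0))

for noise in (32.5, 34.0, 35.5):
    t_1080 = temp_at_noise(noise, *cooler_1080)
    t_ti   = temp_at_noise(noise, *cooler_1080_ti)
    print(f"{noise:.1f} dB: 1080 ~{t_1080:.1f} C, 1080 Ti ~{t_ti:.1f} C, "
          f"delta {t_1080 - t_ti:.1f} C")
```

At every noise level the 1080 Ti line comes out cooler, which is exactly the relationship the graph is drawn to show.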

I sometimes wonder if this kind of error is the cause of some baffling decisions from governments and companies.
Seriously, I'd expect anyone on this site to be able to read the graph.

How about adding something like *at constant GPU load. If Nvidia were really smart, they would've simply put the noise delta on the X axis; otherwise this is what people can interpret, and they're not wrong either ~

[Attached image: 9jc17nz.png]
 
The reason we see the weird memory configuration here is that it has 6 ROPs disabled, and we know the ROPs are tied to the memory controllers, like on the GTX 970. There, 4 disabled ROPs caused the last 32-bit portion of the bus to be hitched to another ROP/memory-controller block (the 3.5+0.5 GB split), IIRC.

Now Nvidia has learned their lesson. To avoid another GTX 970 fiasco, they simply disabled the last 32-bit MC completely, along with 6 ROPs; hence what you see now: 11 GB over a 352-bit bus. If they had tried to 'normalize' it by advertising 12 GB (11+1), it would have revived the old stigma.
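The arithmetic behind that is straightforward. A minimal sketch, assuming GP102's full die has twelve 32-bit memory controllers with 1 GB of GDDR5X behind each (the Titan X layout):

```python
# Back-of-the-envelope check of the 352-bit / 11 GB configuration.

MC_WIDTH_BITS = 32   # width of one memory controller
CHIP_SIZE_GB  = 1    # memory attached to each controller
FULL_MCS      = 12   # controllers on the full GP102 die

enabled = FULL_MCS - 1  # 1080 Ti: one controller disabled outright
print(f"bus width: {enabled * MC_WIDTH_BITS} bit")  # 352 bit
print(f"capacity : {enabled * CHIP_SIZE_GB} GB")    # 11 GB
```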

Not exactly. You are correct that the ROPs are tied to the memory controllers. However, nVidia didn't actually disable any ROPs on the GTX 970. The GTX 970 still had all 64 ROPs enabled, because it had all 8 memory controllers enabled.
It was disabling the L2 that caused the issue. Each memory controller, and its ROPs, is linked to a block of L2. When they disabled a block of L2 in the GTX 970, that block's memory controller and ROPs had to be jumpered over to another block of L2.
The ROPs in the jumpered section were technically still active. However, nVidia designed their driver not to use them, because using them would actually have resulted in slower performance.

In the case of the GTX 1080 Ti, they likely also lowered the amount of L2. We won't know for sure, because L2 is not an advertised spec. And you are probably right: in this case, they also went ahead and disabled the memory controller and its associated ROPs to avoid any kind of fiasco.
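For anyone who wants the GTX 970 situation spelled out, here's a toy model of my understanding of it; the slice numbering and segment behavior are illustrative assumptions, not NVIDIA's actual layout:

```python
# Toy model: all eight 32-bit controllers (0.5 GB each) stay enabled,
# but one L2 slice is fused off, so the orphaned controller gets
# jumpered onto a neighbour's slice and its segment is slower to reach.

partitions = [
    {"mc": i,                    # memory controller index
     "l2": i if i != 7 else 6}   # slice 7 disabled -> share slice 6
    for i in range(8)
]

fast = [p for p in partitions if p["l2"] == p["mc"]]
slow = [p for p in partitions if p["l2"] != p["mc"]]
print(f"fast segment: {len(fast) * 0.5} GB")  # 3.5 GB
print(f"slow segment: {len(slow) * 0.5} GB")  # 0.5 GB
```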

Wuh... what? What's this 100%/80% nonsense you're talking about?

DP is a digital signal. It either gets there or it doesn't. I have some of the cheapest DP cables I could find driving a 4K and a 2560x1440 monitor... DP is equal or better in every way, last I understood.

The only way I see his statement making sense is if he was using a DP -> DVI adapter. I've had some of those really suck.

But he's really complaining about nothing for two reasons:

1.) This is just the reference output design. AIBs can change it however they want, and I'm sure some will add a DVI port.
2.) It has an HDMI port. Since DVI and HDMI use the exact same signal, he can just pick up a cheap HDMI -> DVI adapter or cable.

A slightly crippled GPU and a weird 11 GB of RAM on their top GTX? Now that's just fugly.

Just like so many great GPUs before it.

Seriously, I'd expect anyone on this site to be able to read the graph.

How about adding something like *at constant GPU load. If Nvidia were really smart, they would've simply put the noise delta on the X axis; otherwise this is what people can interpret, and they're not wrong either ~

It doesn't matter which axis is which. The graph would read the same. However, the point nVidia was making was that the 1080 Ti cooler gives lower temperatures than the 1080 cooler. So most people expect the 1080 Ti line to be lower on the graph than the 1080's, not shifted a little to the left. Visually, if your point is that something is lower than something else, you orient your graph axes so that it is visually lower on the graph.

And they did put, in clear-as-day letters, that they were testing both at 220 W.
 
After the "do you get lower room temperatures with a bigger cpu cooler?", the new TPU complex science challenge is understanding the nvidia cooler graph. Seriously i can even understand what people isn't understanding.

On the cooler subject, I'd like to point out that there are two things that favor the 1080 Ti's cooling over the 1080's:

1. No DVI port - increased airflow, lower turbulence.
2. Bigger die area - higher heat transfer at the same power.

I bet those two make up most of this 5 C difference at the same power, and Nvidia changed nothing, or almost nothing.
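To put a rough number on that, here's a quick sanity check using the steady-state relation T_die = T_ambient + R x P. The 220 W load and 5 C delta are from this thread, and pinning the whole difference on cooler thermal resistance is an assumption for illustration:

```python
# Implied change in cooler thermal resistance from a 5 C gap at 220 W.

P       = 220.0        # heat load, watts
delta_T = 5.0          # temperature gap between the two coolers, C

delta_R = delta_T / P  # implied thermal-resistance difference, C/W
print(f"implied delta R: {delta_R:.3f} C/W")  # ~0.023 C/W
```

That ~0.02 C/W is small enough that freeing up the DVI slot and the larger die contact area could plausibly account for it on their own.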
 
Says $699 in what you linked... and is correct?

Edit: you meant the 1080... in a 1080 Ti thread... and didn't say it. LOL!
 
So 35% stronger than the 1080? It's just a Titan with a different name.


Vega should have no trouble defeating this if they want it to.
 
Yeah Vega is a beast, at least 50% faster I've heard, should cost only $400 too and come with a free HTC Vive.

Nvidia are doomed.
 
After the "do you get lower room temperatures with a bigger cpu cooler?", the new TPU complex science challenge is understanding the nvidia cooler graph. Seriously i can even understand what people isn't understanding.

On the cooler subject, id like to point that there are 2 things that favors the 1080 ti cooling over the 1080:

1. No dvi port - increased flow, lower turbulence.
2. Bigger die - higher heat transfer at same power.

I bet those two make up for the most of this 5 C diference at the same power. And nividia changed nothing or almost nothing.

I think the interesting thing is that everyone is arguing about the reference cooler, something almost none of us will use because it's still crap compared to the 3rd-party coolers that the AIBs will use.
 
I would not call it exactly crap; there are tradeoffs to be made with a cooler that has to be a blower.
There is not much you can do to a blower-type cooler beyond what Nvidia currently has on the 1080/1080 Ti/Titan Xp, at least not in a reasonable price range.
 
If anyone honestly thinks the price is due to anything other than wanting to capture market share and respond to competition, even from older cards and AMD...

Stop being delusional.

Also, it looks amazing. When are the actual reviews out?
 
Yeah Vega is a beast, at least 50% faster I've heard, should cost only $400 too and come with a free HTC Vive.

Nvidia are doomed.

Do you realize how silly you sound? You are acting like it's insane that after 2 years AMD could make a card 50% stronger than their previous flagship.


They have done that every generation lol.
 
Do you realize how silly you sound? You are acting like it's insane that after 2 years AMD could make a card 50% stronger than their previous flagship.
They have done that every generation lol.
You are right that AMD should be able to do 50% over their previous flagship.
However, AMD's previous flagship was the Fury X, and 50% on top of the Fury X would put performance only slightly above the GTX 1080.

Depending on where exactly they want Vega to land, it might not be enough.
 
11 GB of VRAM screams something isn't right... W1z... come on, back my old ass up here... Core math doesn't hold up on this, lol... 2+2 isn't almost 4.

With modern math, AKA Common Core math, it certainly is! Probably more like -14.5 with where our math education is going. And as for the 2+2, that probably = wtf you want it to!
 
Do you realize how silly you sound? You are acting like it's insane that after 2 years AMD could make a card 50% stronger than their previous flagship.


They have done that every generation lol.

In his sarcasm, I bet he meant 50% faster than the 1080 Ti.

In all seriousness, AMD should be able to do way more than +50% over the Fury X, which came with a gimped HBM infrastructure. My bet, though: top Vega won't beat the 1080 Ti, but it will come within about 10% of it.
 
Yes... yet no. As you pointed out, they didn't technically screw up the graph or get it "wrong"; they show the proper relationship between cooler noise and temperature. So yes, I was just being funny and sarcastic. But as others have stated, it isn't an incorrect graph necessarily; it's just poorly executed, with parameters, like fan speed, not set or stated, which would have stopped everyone from questioning it. So yes, the graph seems accurate and I get what it is trying to say, but it did take me a moment to look at it and see what they were doing. It works, it just could have been executed better.
 
Noise level and fan speed would be on the same axis, and the graph would be largely the same. Maybe they chose noise because it drew straighter lines?

Now that I've looked closer at that temp/noise graph, though, "1080 @ 220 W" and "1080 Ti @ 220 W" is misleading as hell. The 1080's reference TDP is 180 W; the 1080 Ti's is 250 W. Even with a better cooler, the actual end result will be worse.
 
Noise level and fan speed would be on the same axis, and the graph would be largely the same. Maybe they chose noise because it drew straighter lines?

Now that I've looked closer at that temp/noise graph, though, "1080 @ 220 W" and "1080 Ti @ 220 W" is misleading as hell. The 1080's reference TDP is 180 W; the 1080 Ti's is 250 W. Even with a better cooler, the actual end result will be worse.
Of course. News at 11: the 180 W TDP cooler is less efficient/noisier than the 220 W TDP cooler.
This is what many are ignoring, and it's also the reason the graph didn't make sense to me at first, not to mention I couldn't recall the 1080's TDP immediately.
 
Yes... yet no. As you pointed out, they didn't technically screw up the graph or get it "wrong"; they show the proper relationship between cooler noise and temperature. So yes, I was just being funny and sarcastic. But as others have stated, it isn't an incorrect graph necessarily; it's just poorly executed, with parameters, like fan speed, not set or stated, which would have stopped everyone from questioning it. So yes, the graph seems accurate and I get what it is trying to say, but it did take me a moment to look at it and see what they were doing. It works, it just could have been executed better.
The graph is perfect. The RPM value is meaningless; what matters is noise level and cooling performance, both shown on the graph. But, as I said before, it's not an apples-to-apples comparison because of the difference in die area and outlet design. So I'm not buying the "better cooler" claim.
 
Throw a water block on it and you get a single-slot card!!! FINALLY!! The day of DVI is over!
 
Throw a water block on it and you get a single-slot card!!! FINALLY!! The day of DVI is over!

It's not. AIBs are 99% likely to put one on there, because the vast majority of buyers have DVI monitors. Yes, you can use an adapter, but they won't want to alienate buyers.
 
Probably 11 GB due to the die cuts directly affecting the memory interface and Nvidia trying to avoid a much slower 12th GB.
 
I love official graphs for this stuff, always making minor differences look huge :laugh:. The percentage comparison at the end is solid gold :roll:. It'll be interesting to see what the non-reference boards manage.
 