
NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

Oh for crying out loud. How hard can this be to understand, people? They made a mistake. Plain and simple.

Here. I fix.

[Attachment: cooler performance corrected.PNG]


How difficult was that?
 
Oh for crying out loud. How hard can this be to understand, people? They made a mistake. Plain and simple.

Here. I fix.

[Attachment: cooler performance corrected.PNG]

How difficult was that?

I think your mom made a mistake lol. Their graph is completely fine: the dBA corresponds to the fan duty cycle, meaning that at a constant graphics load (220 W, as in the graph), increasing the fan duty cycle reduces GPU temperature.
 
Yeah...I know. Fans get louder as their speed increases. Higher dBA = louder. Common knowledge. Therefore the graph is completely and totally wrong.
 
I think your mom made a mistake lol. Their graph is completely fine: the dBA corresponds to the fan duty cycle, meaning that at a constant graphics load (220 W, as in the graph), increasing the fan duty cycle reduces GPU temperature.

Makes that "MrGenius" username kinda ironic now, doesn't it :p

Yeah...I know. Fans get louder as their speed increases. Higher dBA = louder. Common knowledge. Therefore the graph is completely and totally wrong.

You do realize that in your "improved" graph the fans spin down, aka make less noise, while also somehow reducing the temperature of the card more than when they were spinning faster, right?
 
Just came by to add 2 lines of information not included in the TPU news, forgive me if I am wrong:

- No DVI port: a nice clean backpanel :)

- Founders Edition available next week :)


BTW: "Founders Edition" sounds soooo Karl May to me... (maybe only Germans will understand... :-) or let's say "The Last of the Mohicans"
 
You do realize that in your "improved" graph the fans spin down, aka make less noise, while also somehow reducing the temperature of the card more than when they were spinning faster, right?
Not necessarily. If the default fan curve is set aggressively, the fan speed ramps up quickly depending on the GPU load & temperature. If it's (relatively) conservative, then you'll see a flat line wrt noise, & the fan speed isn't lowered as quickly as the GPU load/temp drops. He's right btw IMO, someone goofed up the original graph.
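
As a rough illustration of the difference between those two curve styles, here's a minimal sketch; the breakpoints and duty cycles are invented for illustration, not any card's real curve:

```python
# Toy fan-curve sketch: duty cycle (%) as a function of GPU temperature (C).
# All breakpoints below are invented for illustration.

def fan_duty(temp_c, curve):
    """Linearly interpolate the duty cycle from a list of (temp, duty) points."""
    points = sorted(curve)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

aggressive   = [(40, 30), (60, 60), (80, 100)]  # ramps up early
conservative = [(40, 30), (70, 40), (90, 100)]  # stays quiet longer

for t in (50, 65, 80):
    print(f"{t}C -> aggressive {fan_duty(t, aggressive):.0f}%, "
          f"conservative {fan_duty(t, conservative):.0f}%")
```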
 
That bus and core count seem not to match. Like everything else. Also: 352-bit bus and 11GB of memory. I think this will be a 970-like GPU, unable to use some of its potential.
 
Just came by to add 2 lines of information not included in the TPU news, forgive me if I am wrong:

- No DVI port: a nice clean backpanel :)

- Founders Edition available next week :)


BTW: "Founders Edition" sounds soooo Karl May to me... (maybe only Germans will understand... :) or let's say "The Last of the Mohicans"

It's rather Old Shatterhand.
 
He's right btw IMO, someone goofed up the original graph.


You guys are embarrassing yourselves. Maybe today, maybe tomorrow, and I hope one day, you'll look at what you've written and slap yourselves while looking into the mirror.

I don't have anything else to say, other than 'wow'... :shadedshu:
 
The thing is, if you compare the RX 480 and GTX 1060, the latter needs what, an extra 400 MHz to match a 1400 MHz RX 480? Meaning we're dealing with two quite different architectures. NVIDIA's isn't as efficient in terms of raw power per clock and needs to compensate with really high GPU clocks. It's actually been like this for a while. Even with the GTX 900 series: R9 Fury X was what, 1050 MHz? My GTX 980 is running at 1400 MHz and it about matches the R9 Fury X in performance. Sometimes. You can't just say uh oh, RX Vega will suck because AMD can't clock it that high. That's kinda irrelevant.
The GTX 1060 is ~15% smaller with ~25% fewer transistors. Also, at the Pascal launch NVIDIA was talking about having to specifically alter the architecture to make it clock higher, so that clock speed is not just coming out of nowhere. They even admitted that Pascal might have lost a bit of IPC in the process compared to Maxwell.

And the clock speed difference is even more than 400 MHz. The RX 480 runs at 1266 MHz boost and the GTX 1060 tends to stay above 1800 MHz for all but excessive loads.

Fury Xs clocked to around 1.15-1.2 GHz. Your GTX 980 at 1.4 GHz is already pretty good; I think on average they went to 1.4-1.45 GHz. I am surprised it'd match a Fury X at that clock though, Fury should still beat it in most cases. Fiji is simply much larger.
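
For what it's worth, the actual gap from the boost clocks quoted above, as simple arithmetic on the numbers in this post:

```python
# Clock gap between the two cards, using the boost clocks quoted above.
rx480_mhz = 1266
gtx1060_mhz = 1800  # typical sustained boost, per the post

print(gtx1060_mhz - rx480_mhz)             # 534 MHz difference
print(round(gtx1060_mhz / rx480_mhz, 2))   # ~1.42x, i.e. ~42% higher clock
```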
 
On the Fury X launch, they were about neck and neck. Later, as drivers matured, the R9 Fury X pulled ahead. And even more so in D3D12 and Vulkan, where it just obliterates it. It's why I regret, a tiny bit, that I went with the GTX 980. I mean, it's not a bad card, but the Fury X is just better, despite certain other limitations.
 
(off topic, but..)

Why the joy at no DVI?
I still use the DVI connection in mine and never mind DP, mini or proper.
Knowing full well in advance how each and every one of you here is going to come and correct me, lol, I can tell you that, for the visual signal only:

If my DVI gives me 100%, the DP connection gives me about 75-80%. A bit fuzzier, a bit less crisp, the whites not so intense as they were with DVI; like someone took the contrast down a notch or two. No, I didn't forget to set the monitor right (choose PC rather than 'TV'); no, I didn't forget to set the frequency to 144; and yes, the comparison was with default settings, no calibrating, before or after.
The DVI-D is a piss-poor no-name cable that came bundled with my monitor (the one in my specs). The DP cable cost a fortune; we're talking a big brand used in high-end audiovisual equipment. Ended up giving it away.
 
Seriously guys, please learn to read a graph properly.

As said before, generally the X-axis is the control parameter (the source, input) and the Y-axis is the response parameter (the result, output). Changing the parameter on X results in a change in the Y parameter.

In this case the graph depicts COOLER PERFORMANCE when dissipating 220 watts of heat. Here the performance of a cooler is characterized by how low the temperature is and how quiet the cooling system is.

For a given cooler design (there are two in the graph, the 1080 cooler and the 1080 Ti cooler) and a constant heat load, the resultant junction temperature (Y-axis) is correlated with the noise generated by the cooling system (here it is assumed that noise is correlated with fan speed).
- By varying the noise (fan speed) the resultant temperature will vary too.
- Take a sample at low noise (low fan speed): with 32.5 dB of noise, the 1080 cooler results in ~95C, the 1080 Ti cooler in 88C.
- Take a sample at high noise (high fan speed): with 35.5 dB of noise, the 1080 cooler results in ~87C, the 1080 Ti cooler in 82C.

- Notice that as fan speed goes up (noise increased) the temperature lowers; intuitive, isn't it?
- Notice at the same noise level, 1080 Ti cooler results in lower temperature than 1080 cooler, this is the point that the graph is trying to get across.
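
If it helps, the same relationship can be written as a toy thermal model: junction temperature = ambient + power × thermal resistance, where a faster (louder) fan lowers the thermal resistance. The coefficients below are invented, chosen only to land near the sample points above:

```python
# Toy model of the cooler-performance graph. T_AMBIENT, r_base and slope are
# invented coefficients, tuned to roughly reproduce the sample points above.

P_WATTS = 220.0    # constant heat load, as in the graph
T_AMBIENT = 25.0

def junction_temp(noise_dba, r_base, slope):
    """Thermal resistance (C/W) falls linearly as noise (fan speed) rises."""
    r_th = r_base - slope * (noise_dba - 32.5)
    return T_AMBIENT + P_WATTS * r_th

for noise in (32.5, 35.5):
    t_1080 = junction_temp(noise, r_base=0.318, slope=0.012)  # 1080 cooler
    t_ti   = junction_temp(noise, r_base=0.286, slope=0.009)  # 1080 Ti cooler
    print(f"{noise} dBA: 1080 cooler ~{t_1080:.0f}C, Ti cooler ~{t_ti:.0f}C")
```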

I sometimes wonder if this kind of error is the cause of some baffling decisions from governments and companies.
 
You're move AMD.
You're move AMD.
You're move AMD.

Seriously, it's you're move AMD.

Anyway, looking forward to seeing custom AIB cards/reviews.
 
You're move AMD.
You're move AMD.
You're move AMD.

Seriously, it's you're move AMD.

Anyway, looking forward to seeing custom AIB cards/reviews.

Someone's going to pull up the 'you're' usage. May as well be me!

Your move Fluffmeister.
 
Looks very good. But unfortunately its price may be $1,375 in Turkey. Tax policies are not good here. Damn!
 
Hmm, 11 Gbps GDDR5X; I thought Micron had skipped the faster GDDR5X for GDDR6. But now they are back on their site.

The price is surprisingly good, so how long will it take them to release a Titan X Black with the full GP102 and maybe 12 Gbps GDDR5X?
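
For reference, the bandwidth arithmetic behind that 11 Gbps figure:

```python
# Peak memory bandwidth = (bus width in bytes) x (per-pin data rate).
bus_width_bits = 352
data_rate_gbps = 11          # GDDR5X, per pin

bandwidth = bus_width_bits / 8 * data_rate_gbps
print(bandwidth)             # 484.0 GB/s
```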
 
The reason we see a weird memory configuration here is that ROPs are tied to the memory controllers, as the GTX 970 showed: there, the disabled ROP/L2 partition meant the last 32-bit portion of memory had to hitch a ride through a neighboring ROP/memory-controller pair (the 3.5+0.5 GB split), IIRC.

Now NVIDIA has learned its lesson. To avoid a GTX 970-style fiasco, they simply disabled the last 32-bit MC completely, along with its 8 ROPs (88 of GP102's 96 remain); hence the 11 GB over a 352-bit bus you see now. If they had tried to 'normalize' it and advertise 12 GB (11+1), it would have revived the old stigma.
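
The arithmetic behind that configuration, assuming GP102's full complement of twelve 32-bit memory controllers (a sketch, not an official breakdown):

```python
# GP102's full memory interface: twelve 32-bit controllers, 1 GB per controller
# (12 GB over 384-bit on the Pascal Titan X).
full_controllers = 12
bits_per_controller = 32
gb_per_controller = 1

# Disabling one controller outright, instead of leaving a slow partition
# hanging off it as on the GTX 970, yields the 1080 Ti's configuration:
active = full_controllers - 1
print(active * bits_per_controller)  # 352 (bit bus)
print(active * gb_per_controller)   # 11 (GB)
```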
 
Therefore the graph is completely and totally wrong.
LOL at the wrong-graph conspiracy ... your "fixed" graph says that somehow if you make your card quieter (by lowering the fan speed), it will also drop temps :laugh: maybe in the bizarro universe o_O
 
I wonder if that improved cooling graph is only due to removing the DVI slot. That should give a good 40-50% extra exhaust space, which is what blowers live by...

In general though, the 1080 Ti is better than expected. With very few details, rumors pointed to a more cut-down chip, GDDR5 and a higher price. None of that came to pass. Performance close to the Titan X (Pascal) is not bad at all.
 
Why the joy at no DVI?
I still use the DVI connection in mine and never mind DP, mini or proper.
Knowing full well in advance how each and every one of you here is going to come and correct me, lol, I can tell you that, for the visual signal only:

If my DVI gives me 100%, the DP connection gives me about 75-80%. A bit fuzzier, a bit less crisp, the whites not so intense as they were with DVI; like someone took the contrast down a notch or two. No, I didn't forget to set the monitor right (choose PC rather than 'TV'); no, I didn't forget to set the frequency to 144; and yes, the comparison was with default settings, no calibrating, before or after.
The DVI-D is a piss-poor no-name cable that came bundled with my monitor (the one in my specs). The DP cable cost a fortune; we're talking a big brand used in high-end audiovisual equipment. Ended up giving it away.
wuh...what? What's this 100% 80% nonsense you're talking about?

DP is a digital signal. It either gets there or it doesn't. I have some of the cheapest DP cables I could find driving a 4K and a 2560x1440 monitor... DP is equal or better in every way, last I understood.

That bus and core count seem not to match. Like everything else. Also: 352-bit bus and 11GB of memory. I think this will be a 970-like GPU, unable to use some of its potential.
It matches. See the math above.
 
Remarkable progress in the arrangement of the outputs, and the cooling will be greatly improved. It's ideal for micro-ATX: you can get SLI with liquid cooling blocks, since everything fits in a single slot. Nicely done. Given the ~80% better utilization of the RAM, 1 GB less won't hurt! However, the price will be higher here because of the 22% tax, and special versions will add another $100 to $200. I figure 900€ for a GTX 1080 with a block on it (MSI + EK). The GTX 1080 Ti will be 1000€. I'm good with a pair of GTX 1080s, playing at 3440x1440 Ultra and enjoying it.
 