
NVIDIA GTX 1080-successor a Rather Hot Chip, Reference Cooler Has Dual-Fans

Makes sense to me. The Tensor cores in Volta are not things NVIDIA wants to waste fab space on for gamers. Turing always made sense considering the delay between Pascal and now. NVIDIA pushing RTX also fits NVIDIA's modus operandi.
Totally, I think it will also be on the same node as the current gen, which fits with the subject of this thread's OP.
 
RTX looks more and more like fringe technology to me, something that can't be used extensively for proper results. It reminds me a lot of tessellation: we got bombarded with it when DX11 arrived, and you would have been convinced that within 3-4 years every surface rendered in a game would be tessellated. Fast forward to today and it's still used sparingly, since its cost still far outweighs the results, and it will likely remain that way; stuff like parallax occlusion mapping proved to be much more feasible across all types of hardware and platforms.

Integrating that into hardware and building an entire product line based on it? That might be too much even for Nvidia.
 
Totally, I think it will also be on the same node as the current gen, which fits with the subject of this thread's OP.
Exactly. Probably 12 nm, which means Turing chips are big, hot, and hungry.
 
Exactly. Probably 12 nm, which means Turing chips are big, hot, and hungry.

But hopefully fast, and as we all know, heat and power consumption are apparently a non-issue these days.
 
Not when the sticker is green.

 
RTX looks more and more like fringe technology to me, something that can't be used extensively for proper results. It reminds me a lot of tessellation: we got bombarded with it when DX11 arrived, and you would have been convinced that within 3-4 years every surface rendered in a game would be tessellated. Fast forward to today and it's still used sparingly, since its cost still far outweighs the results, and it will likely remain that way; stuff like parallax occlusion mapping proved to be much more feasible across all types of hardware and platforms.

Integrating that into hardware and building an entire product line based on it? That might be too much even for Nvidia.
Lighting is not a local thing; it's scene-wide and much more likely to yield visible results. That said, at the end of the day it's still just a tool, so it very much depends on how you use it. That, and if history repeats itself, only the second hardware iteration will be able to handle it properly (but that's just an assumption on my part).

Also, if you truly believed that about tessellation, you must have missed TruForm before it.

Not when the sticker is green.
Or blue?
 
Most people don't give a shit as long as it's perceived as the "fastest" GPU. This will sell like "hot" cakes no matter its actual specs and abilities, just because Nvidia is a "trusted" brand in the eyes of the average consumer. It will probably be the fastest GPU, though, and probably the hottest if Nvidia is trying to put in it the "compute" abilities that previous gens didn't get. OR, as mentioned previously, Nvidia is trying to outsell its AIB vendors by implementing such a design. OR it's a combination of both. Regardless, at the moment all this is speculation; we shall see what's what in due time.
 
I would just wait for proper reviews where retail-ready samples are tested and see if these new hot-running chips are worth the upgrade or not. Until then, I'll hold onto my wallet.
 
I would just wait for proper reviews where retail-ready samples are tested and see if these new hot-running chips are worth the upgrade or not. Until then, I'll hold onto my wallet.
Worth the upgrade is relative. If you own another card from last generation's same tier, it's usually not worth it. If you own something older or are looking to jump up a tier, it's usually worth it.
 
Irony, double standards... it's all comedy gold, frankly.
Right, it was "somebody else", not bent-over-backwards team greens, calling cards consuming 250 W "power hogs" and a 50-70 W difference in power consumption "huge".

I'm preparing popcorn for the price justification talks.
 
Right, it was "somebody else", not bent-over-backwards team greens, calling cards consuming 250 W "power hogs" and a 50-70 W difference in power consumption "huge".

I'm preparing popcorn for the price justification talks.

I guess AMD shouldn't have made that stupid video mocking Fermi's power consumption in the first place, it was always going to come back and bite them in the arse.

Also, people moaning about the price of Nvidia cards is a given.
 
I guess AMD shouldn't have made that stupid video mocking Fermi's power consumption in the first place

Making fun of something that was actually true and relevant is stupid now, huh? Figured as much; comedy sure has gotten weird over the years.
 
Right, it was "somebody else", not bent-over-backwards team greens, calling cards consuming 250 W "power hogs" and a 50-70 W difference in power consumption "huge".

I'm preparing popcorn for the price justification talks.
So your problem is what, exactly?

Making fun of something that was actually true and relevant is stupid now, huh? Figured as much; comedy sure has gotten weird over the years.
It's not stupid; it just goes to show that even AMD agrees power draw is important. What puts them in a bad light is that they lost the power efficiency crown right after that hardware generation and never got it back.
 
It's not stupid; it just goes to show that even AMD agrees power draw is important. What puts them in a bad light is that they lost the power efficiency crown right after that hardware generation and never got it back.

Bingo, at least someone gets it.
 
Don't worry, everybody gets it. It's just more convenient to play dumb when you don't like it ;)

Yep, so in short, medi01 is upset about green team members calling cards "power hogs" and highlighting a 50-70 W difference in power consumption as "huge".

Yet it was his beloved that made it a pissing contest in the first place.

I made a sarcastic comment about power consumption apparently not being important... and you know the rest.
 
Yep, so in short, medi01 is upset about green team members calling cards "power hogs" and highlighting a 50-70 W difference in power consumption as "huge".

Yet it was his beloved that made it a pissing contest in the first place.

I made a sarcastic comment about power consumption apparently not being important... and you know the rest.
It's such a useless discussion; I don't know why it gets brought up so often. Yes, Nvidia had Fermi (and even the FX 5000 series before that). Yes, AMD has fallen behind since their failure to implement TBR. But other than that, they were pretty close to each other, so usually you can pick from either camp. But when the difference widens, it makes sense to warn potential buyers. That is all. The only idiotic thing here is trying to make it look like only one of the players can fall behind. Even more idiotic is to keep reiterating that.
 
Remember that Titan V gets a lot of its power efficiency from HBM2, which uses about a third of the power GDDR5 uses for the same performance, and it doesn't have a significant increase in the number of cores. So the HBM2 efficiency covers the increase in compute cores.

...Which means that Titan V and Titan Xp perf/watt are roughly the same.


Memory power draw is a small percentage of a graphics card's total power draw; the GPU itself draws most of the power.
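A rough back-of-envelope sketch of that point in Python, with purely illustrative assumptions (250 W board power, memory taking ~15% of it, HBM2 drawing a third of GDDR5's power at equal performance): if the memory subsystem is only a modest slice of the board budget, even a two-thirds cut there moves total board power, and therefore perf/watt, only a little.

```python
# Back-of-envelope estimate. All figures below are assumptions for
# illustration only, not measured data for any specific card.

def board_power_with_hbm2(total_board_w, mem_share, hbm2_vs_gddr5=1/3):
    """Estimate board power after a hypothetical GDDR5 -> HBM2 swap.

    total_board_w -- assumed total board power with GDDR5 (e.g. 250 W)
    mem_share     -- assumed fraction of board power spent on memory (e.g. 0.15)
    hbm2_vs_gddr5 -- assumed HBM2 power relative to GDDR5 at equal performance
    """
    mem_w = total_board_w * mem_share          # power spent on memory
    saved_w = mem_w * (1 - hbm2_vs_gddr5)      # power saved by the swap
    return total_board_w - saved_w, saved_w

new_total, saved = board_power_with_hbm2(250, 0.15)
print(f"~{saved:.0f} W saved, ~{new_total:.0f} W board power "
      f"({saved / 250:.0%} of the original budget)")
```

Under those assumptions the swap saves about 25 W on a 250 W card, roughly 10% of the budget, so the GPU cores still dominate the equation.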
 
For what it's worth, the PCB for the Founders Edition GTX 1080 had solder points for a 6-pin PCI-E power connector in addition to the 8-pin that was used.

And just because the PCB has two fan headers doesn't mean both will be used. Definitely possible, yes, but it's folly to assume this is definitely happening.
 