
NVIDIA RTX 4080 Rumored To Feature 420 W TDP

Whilst true, I'm far more interested in seeing better optimised games than blindly trying to brute-force it with unwanted space heaters. The other half of the "My GPU isn't powerful enough for this game" equation is "this game dev can't optimise for sh*t"...

Edit: Looking back on the past 35 years of PC gaming, I remember being blown away in the late '90s to mid-2000s when games like Thief, Half-Life and FEAR came out. Not just the graphics, but the whole package, e.g., The Dark Engine's audio propagation physics on an Aureal 3D sound card, enemies communicating with each other in FEAR in an actually believable way, etc., and thinking, damn, what are games going to be like in 25 years' time. 25 years on, and advances in these areas are flat as hell. Everything is solely about graphics, and even that's starting to experience the same diminishing-excitement issue as Hollywood CGI, i.e., once you've seen the 300th similar-looking photo-realistic clone, it's no longer that 'interesting' anymore and you start craving stuff that actually looks and feels different, e.g., Dishonored's water-colour or Bioshock's underwater art-deco styles. I'd take more of these types of games that run exceptionally well on an average GPU any day over the 'need' for a 1,500 W RTX 6090 Ti Super Titan for Crysis 6's 7-hour-long 'content'...
For open world there was GTA: SA, a big upgrade over the previous two 3D GTA games, then there were games like Quake, Doom (up to 3, and 3 also scaled well with SLI), etc... and games back then were real purchases, unlike today's rent system thanks to Steam, Epic and others.
 
You're assuming AMD and Nvidia will want to drag two engines with them forever. They won't. They'll find a breaking point and go full RT. But again, 10 years is a (very) optimistic estimation for when that will happen.
No one's forcing them to now, e.g., AMD doesn't have Tensor cores separate from the regular ones and clearly still isn't losing out in other related areas (FSR 2.0). That nVidia approached the situation by having dedicated cores only a few games use still doesn't make 100% future usage some market requirement, any more than it did for "nVidia acquired Ageia, therefore 100% of games will go full PhysX". And if "RT-only GPUs" broke compatibility with 99% of my 2,000+ PC game collection simply because someone else was overly obsessed with ray-tracing, well, that's a rather out-of-touch thing to argue for, to be honest. The entire point and #1 killer feature of PC is backwards/forwards compatibility; otherwise save your money and just go buy The Console Of The Day...
 
Somebody from the government should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...

They should have stopped crypto before it even got started, but they can leave my fucking gaming hobby alone. Unless they can become competent enough to ban all crypto from businesses and banks, they need to step the fuck back from my gaming and my air conditioner usage.
 
That's not how it works. Excess energy is converted to heat, but heat on its own does not contribute to global warming ;) Read a bit on the greenhouse effect.
As far as I know, in a CPU/GPU there is no excess energy: total power used = energy for computation + waste due to leakage, which is inherent and unavoidable. The current flow/electron movement causes the heat. Increased work means more power is needed and more leakage appears, which results in more combined heat, and that heat also increases leakage, which generates more heat, all adding up to the maximum the chip can handle. This isn't excess, but only what is needed to work properly with the actual technology.
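To make that feedback loop concrete, here is a minimal toy sketch in Python. Every constant (base leakage, temperature coefficient, cooler °C-per-watt) is an illustrative assumption, not real silicon data; it only shows how dynamic power, temperature and leakage feed into each other until they settle.

```python
# Toy model of the power -> heat -> leakage -> more heat feedback described above.
# All constants are made-up illustrative values, not measured silicon parameters.

def leakage_power(temp_c, base_leak_w=20.0, temp_coeff=0.01):
    """Leakage grows roughly exponentially with temperature (toy assumption)."""
    return base_leak_w * (1 + temp_coeff) ** (temp_c - 25)

def settle(dynamic_w, ambient_c=25.0, c_per_watt=0.15, iterations=50):
    """Iterate power -> temperature -> leakage until the values stop moving."""
    temp = ambient_c
    total = dynamic_w
    for _ in range(iterations):
        total = dynamic_w + leakage_power(temp)     # total draw at this temperature
        temp = ambient_c + c_per_watt * total       # assumed cooler: 0.15 °C per watt
    return total, temp

for load in (200, 300, 400):                        # dynamic (switching) power in watts
    total, temp = settle(load)
    print(f"dynamic {load:3d} W -> settles at {total:5.1f} W total, ~{temp:.0f} °C")
```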
 
I don't agree with anything in this long post. Everything that you just posted is wrong.

420 watts for a mid-range card is unacceptable and ugly by nvidia.
4080 won't be high-end. There will be 4090, 4090 Ti and allegedly a Titan-class flagship.

Yes, climate change is indeed influenced by the energy use - transport, industry, etc. which burn polluting coal, oil and fossil gas.


Let's hope AMD's new 7000 cards are OK on power consumption (look at the thread above, which states 750-watt and 650-watt PSUs are recommended), and that nvidia will lose market share this generation.

And no, 1080p gaming is crap, move on to 2160p - better and nicer.

You...seem to be a few steps behind.

The TI branding is a refresh nomenclature. As in, after enough time on the market, and with enough process refinement, they can push out a new revision that is a bit better. That is to say that the 4080 is replaced by the 4080 TI...as such naming it that way makes no sense. You're trying to get to EoL without taking any time to frame the rest of the product lifecycle, which is less than honest.

On top of this, 4050-4060-4070-4080-4090. The Titan, or equivalent, is not part of the regular product stack. The 4080 is therefore 1 from the top, and 3 from the bottom. Exactly how is this a "middle range" product? I ask rhetorically, because by mathematical definition it isn't. It's only "middle range" if you jam in the Titan and TIs that are refreshes....and ignore anything coming from the low end.


You seem to have a funny way of claiming energy usage as a thing. "Even if you use 100% green energy it's going to cause global warming" was your thesis. Fine...then theoretically we could make a dent by simply expunging all radioactive materials and shooting them into space...right? No... Now the definition is just that anybody using power anywhere causes climate change... But apparently it slips by you that the whole string of "how efficient can you actually make it" questions implies that all modern computing is wasteful. If you would open your eyes a little you'd see that the point here was that the logical extension of your argument is stupid. If the logical extension is stupid, it's your opportunity to refine it. If your response to that is to stick your fingers in your ears and yell louder, then you've demonstrated that you have a goal, but no real idea of why or how. For the record, the reasonable answer would have been: "While the use case itself is based upon desires rather than need, a card consuming this much power isn't a responsible way to use the limited resources we now have access to." That's something that can't be argued against...unless it's by the apathy of someone who flies world leaders around in private jets to summits on global climate change.



Let's end on 1920x1080 resolution gaming. You want to bring things up to the next level...when there's literally no comparable hardware infrastructure. Why? Oh...that's personal opinion. Cool.
Now, let's not be bound by personal opinion. Why is 1920x1080 so much of a thing? Well, all TVs are set up at 720p, 1080p, or 4k. There are 8k exceptions...but it's just that. 720p is basically also a joke...so TVs are 1k and 4k. Most people can reasonably do a basic computer, and if they can snag a sub-$200 card for good 1k gaming they've got something that effectively competes with modern consoles...but actually has a good game library. Funny that, what I'm really asking for is a reasonably priced card that most people can buy, which can effectively introduce more people to PC gaming.
Oh, but consoles do 4K now. Fine. You can get 4k with a huge asterisk, for the $600 that you might be able to buy a unit from a scalper for. Not enticing. I'm asking for a genuinely good $200-or-lower card for the masses...which will be significantly more valuable than a million 4080 cards. Not everyone cares about VR and high-end gaming (two valid reasons for a massive leap forward in performance). That said, a fantastic "for the people" card in the next generation is what we need...hence my apathy to $1000+ offerings.
 
Looks like I'll be going AMD this time around. Not sure I want one of those in my mini-ITX, and now that FSR 2.0 is out... no real reason to go with one of these.
 
Looks like I'll be going AMD this time around. Not sure I want one of those in my mini-ITX, and now that FSR 2.0 is out... no real reason to go with one of these.
Have you seen TDP numbers for Radeon 7000 series?
 
Hi,
Nvidia sure isn't following DDR5 memory's trait of lowering voltage at higher speeds :laugh:
 
Have you seen TDP numbers for Radeon 7000 series?
They seem to align with the RX 6000 cards
 
Have you seen TDP numbers for Radeon 7000 series?
The funny thing is, nobody has seen real TDP figures for any next gen cards, they just all took the bait.
 
Usually the 60-class card matches the previous generation's 70/80 tier.
960 -> 770
1060/6 -> 980
2060 -> 1080
3060 -> 2070

So, can a 4060 match or surpass the 3080? Maybe not, but the new CUDA cores do FP32 without sharing it with INT32, so the 4,608 CUDA cores expected in a 4060 are effectively ~33% stronger, equal to a 3070 Ti, and the 3080 10 GB is within spitting distance, attainable with the ~2.5 GHz clocks of the N4 node. I would say that yes, the 4060 beats the 3080 10 GB.
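For what it's worth, here is that back-of-the-envelope arithmetic as a small Python sketch. The 4,608 shaders and ~2.5 GHz are the rumoured figures from the post above, the 3080's 8,704 shaders at ~1.71 GHz boost are its public spec, and the ~0.75 "effective FP32" factor (Ampere sharing half its FP32 units with INT32) is the poster's premise, not a measured fact.

```python
# Back-of-the-envelope FP32 throughput, following the post's reasoning.
# Peak TFLOPS = 2 ops/clock (FMA) * shader count * clock (GHz) / 1000.

def peak_tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000

# Rumoured RTX 4060: 4,608 shaders at ~2.5 GHz (numbers from the post above).
rtx4060 = peak_tflops(4608, 2.5)

# RTX 3080 10 GB: 8,704 shaders at ~1.71 GHz boost (public spec), scaled by an
# assumed ~0.75 factor because Ampere shares half its FP32 units with INT32.
rtx3080_effective = peak_tflops(8704, 1.71) * 0.75

print(f"4060 (rumoured)          : {rtx4060:.1f} TFLOPS")            # ~23.0
print(f"3080 10 GB ('effective') : {rtx3080_effective:.1f} TFLOPS")  # ~22.3
```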
 
It's kind of ironic considering the world is coming to grips with global warming and they keep turning the wick up on these parts. Hello, we want more performance at higher efficiency not less.
I'm sure the 4080 IS more efficient than the 3080, so you got what you want. I don't know why you're complaining.
 
Please don't tell us we can't comment based on multiple sources of random rumours. What next? You want a serious discussion based on facts? Please go away. :D
Commenting is fine, but then there's this guy that apparently has already made a buying decision. And that's his right, no question about it. But I was curious if he knows something I don't.
 
I have a 3080 undervolted to 300 W, and that is really not comfortable in my room in summertime. If you get a good cooler on it, 300 W can be pretty silent; I get about 1,300 RPM at 300 W. Not ideal, but the 4000 series makes this impossible to improve upon.
 
High-power CPUs and GPUs should be licensed and used only by those who really need them, not by simple end-users who want to play games.
This would destroy the consumer market as we know it.
 
Usually the 60-class card matches the previous generation's 70/80 tier.
960 -> 770
1060/6 -> 980
2060 -> 1080
3060 -> 2070

So, can a 4060 match or surpass the 3080? Maybe not, but the new CUDA cores do FP32 without sharing it with INT32, so the 4,608 CUDA cores expected in a 4060 are effectively ~33% stronger, equal to a 3070 Ti, and the 3080 10 GB is within spitting distance, attainable with the ~2.5 GHz clocks of the N4 node. I would say that yes, the 4060 beats the 3080 10 GB.

Interesting months ahead. Nvidia is in trouble if this is true:
Navi 33 breakdown
Regardless, Navi 33 will be AMD's new entry-level/budget GPU class competing with the likes of NVIDIA's RTX 4060 and other GPUs in its class, possibly even one from Intel. Navi 33, like the rest of the RDNA 3 lineup, will be fabricated using TSMC's 6nm process node and feature a monolithic die. Navi 31 and 32, on the other hand, are set to be the industry's first GPUs utilizing a MCM (Multi-Chip-Module) design.

MILD's sources say that the die size for Navi 33 will be between 360 mm² and 460 mm² and that it will carry 128MB of Infinity Cache, though the possibility of 256MB of Infinity Cache is still on the table. Now, the crazy part about this GPU is that it's said to offer performance on par with the Radeon RX 6900XT. That's right, the top-end flagship RDNA 2 GPU is about to be bested by a budget offering very soon.
AMD RDNA 3 Will "Decimate" NVIDIA in Efficiency with its Next-Gen Budget GPU Being Faster Than the Radeon RX 6900XT - Appuals.com
 
Wut? I'm just pointing out that these GPUs are exactly what you want, more efficient than the last gen.
 
Compared to the 3080, the 2080 is 85% as efficient, trading 100 watts for 66% more performance. Now it's another 100 watts for 66%.

Compared to the 3090 Ti, the 4080 probably has 33% more transistors to account for the doubled ROP count, the separate INT cores and the newer RT. Therefore 33% more power, so efficiency is better, but kind of like the 2080 Ti and 3080, same relation: basically 33% more performance for 13% more power.
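Putting those ratios into numbers, here is a quick perf-per-watt sketch. The performance and power figures are the rough estimates from this post plus public board-power specs, so treat the output as illustrative only.

```python
# Relative perf-per-watt = performance ratio / power ratio.
def rel_perf_per_watt(perf_ratio, power_ratio):
    return perf_ratio / power_ratio

# 3080 vs 2080: ~66% more performance for roughly 215 W -> 320 W board power.
print(f"3080 vs 2080   : {rel_perf_per_watt(1.66, 320 / 215):.2f}x perf/W")   # ~1.12x

# Hypothetical 4080 vs 3090 Ti, using the post's "33% more performance for 13% more power".
print(f"4080 vs 3090 Ti: {rel_perf_per_watt(1.33, 1.13):.2f}x perf/W")        # ~1.18x
```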
 
When there is a process node shrink, there is always an improvement in the performance per watt metric.

RTX 3060 is around 20% faster than RTX 2060, while consuming around 5% more power.

That is going from the TSMC 12 nm node to the Samsung 8N node.


NVIDIA GeForce RTX 2060 Specs | TechPowerUp GPU Database


NVIDIA GeForce RTX 3060 Specs | TechPowerUp GPU Database

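The implied perf-per-watt gain from those figures is a one-line calculation; the 20%/5% ratios are the estimates quoted above, not measured data.

```python
# ~20% faster at ~5% more power => roughly 14% better performance per watt.
perf_ratio, power_ratio = 1.20, 1.05
print(f"perf/W improvement: {perf_ratio / power_ratio:.2f}x")   # ~1.14x
```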
 
Cool......

RTX 4080 16 GB = 420 W TDP
RTX 4080 Ti 20 GB = 450 W TDP

Nice, nvidia..... I love it very much.......... as long as I have a 1,200 W pure gold PSU......
 
Somebody from the government should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...
Only if you want progress to stop but expense to increase without bound.

...and don't forget all the offended stakeholders.

Relax, free markets will ruthlessly optimize the parameters of the device quicker than any government agency can assemble a time-wasting, money-wasting round-table group.

Free markets brought you all the great tech you now enjoy. It is a self-optimizing system. It will fix itself.
 
It seems that we can anticipate a fairly huge leap in performance and power consumption considering the smaller node and new architectural features. My expectation (or hope, if you prefer the term) is that cards such as the upcoming RTX 4060 and their AMD equivalents will be quite powerful and get the job done for most users at a perfectly efficient TGP of around 220 watts (or a bit more). None of these cards will be cheap, though.
 
For me, efficient would be something like this:

60 W TDP xx60
100 W TDP xx70
150 W TDP xx80
200 W TDP xx90

The 1070 had a 150 W TDP; we should have made enough gains by now to shave a third off it.
 
For me, efficient would be something like this:

60 W TDP xx60
100 W TDP xx70
150 W TDP xx80
200 W TDP xx90

The 1070 had a 150 W TDP; we should have made enough gains by now to shave a third off it.

But the whole idea is to get more performance for the same wattage with new generations; otherwise we would just keep falling backwards, because games will continue to require more performance and more resources in the future, and that's always been true.

Granted the 4XXX series is looking like a power hog unless the performance warrants it. We will see.
 