
NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

If the leak from kopite7kimi is true, the 5090 is a dual-slot GPU, so it's probably liquid-cooled like the MSI 4090 SUPRIM LIQUID X! If so, AIBs are going to struggle even more to make buyers want their cards. I guess people who want air cooling will go for AIBs, but water cooling is definitely going to become standard sooner or later if GPUs start pulling 600 W+.


As much as naming doesn't mean anything, the TITAN line is definitely aimed at professionals. They have FP64 cores that consumer GPUs do not have, and they usually carry 2x the VRAM for professional workloads too.
A two-slot all-in-one hybrid design would be atypical for a vanilla flagship and has never been done before. Still, that would be the most likely design, unless they somehow managed to take advantage of the new active silicon cooling, though it's doubtful they would take such a gamble with a halo flagship.

 
Honestly, at this point I hope this is a make-or-break point for Nvidia graphics cards. In my view, Nvidia should sell off its video card department. Maybe then we might see some massive improvements. Looking at the specs so far, it's just another machine learning upscaler with DLSS. I do not want an upscaler; I want a video card that can run native 4K or 8K, no upscaler, nothing! Nvidia used to make graphics cards like that. They had a catchy slogan too: "Nvidia, the way it's meant to be played." GTX vs. RTX: now it's all about the ray tracing. Funny, I remember games having ray tracing without a special video card needed. I have looked at some ray tracing conversions, and quite frankly the games look worse than the originals. Why the hell would I want reflections off everything, the walls, the floors? If you fire a weapon in a coal mine with the lights off, guess what: there are no reflections. Black objects absorb light, they don't reflect it.
 
A two-slot all-in-one hybrid design would be atypical for a vanilla flagship and has never been done before. Still, that would be the most likely design, unless they somehow managed to take advantage of the new active silicon cooling, though it's doubtful they would take such a gamble with a halo flagship.

I think they will sooner or later, but yeah, we'll see! They've avoided it until now, so I'm curious what they're going to come up with!
 
Did any Titan after the GK110-based ones have any special FP64 performance? Nope. ;)

They've just been glamorized halo-tier cards with an (almost) full die, full memory bandwidth, and a larger VRAM amount. That's why the x90 is the Titan these days, just branded for gamers.


You were faster; looks like we said the same things.

True. All of the Titans after the first one were really just gaming cards. I checked the GeForce site for every one of them as they were released, and they were all marketed there as the greatest gaming card ever in the various marketing blurbs. The only one that I can't recall seeing advertised on the GeForce site as a gaming card was the Titan V, but it was an unusual GPU spec-wise and the first single-GPU card priced at $3,000. The Kepler Titan Z was priced at $3,000 too, but it was a dual-GPU card. The Titan Z, the GTX 690, and the R9 295X2 were the end of the era of SLI/Crossfire.

The Titans had some appeal to prosumers because of the amount of VRAM. As you said, the 3090 and 4090 just filled the slot that the Titan once held. You can think of the 3090 and 3090 Ti as something like the Kepler Titan and the Titan Black.
 
Honestly, at this point I hope this is a make-or-break point for Nvidia graphics cards. In my view, Nvidia should sell off its video card department. Maybe then we might see some massive improvements. Looking at the specs so far, it's just another machine learning upscaler with DLSS. I do not want an upscaler; I want a video card that can run native 4K or 8K, no upscaler, nothing! Nvidia used to make graphics cards like that. They had a catchy slogan too: "Nvidia, the way it's meant to be played." GTX vs. RTX: now it's all about the ray tracing. Funny, I remember games having ray tracing without a special video card needed. I have looked at some ray tracing conversions, and quite frankly the games look worse than the originals. Why the hell would I want reflections off everything, the walls, the floors? If you fire a weapon in a coal mine with the lights off, guess what: there are no reflections. Black objects absorb light, they don't reflect it.
In a way I wish they would go all-in on raw performance (rasterization) with 4K and 8K gaming as their target, but ray tracing can look amazing when done well! Metro Exodus Enhanced Edition is a perfect example: the game was rebuilt from the ground up with RT and looks amazing! Path tracing is the future but is too demanding right now, hence the need for DLSS and Frame Generation...
 
In my view, Nvidia should sell off its video card department.
Why would they do that? The hardware that goes in your video card is the exact same hardware that ends up in inference accelerators in the cloud, or in workstations for professionals.

They are just making products with the above customers in mind first, and then giving something "good enough" to gamers at a high markup, which most folks will pay anyway.

Most people here can complain and say they won't buy it, but Nvidia's revenue increase in their gaming sector shows that the actual market acts the other way around.
 
Why would they do that? The hardware that goes in your video card is the exact same hardware that ends up in inference accelerators in the cloud, or in workstations for professionals.

They are just making products with the above customers in mind first, and then giving something "good enough" to gamers at a high markup, which most folks will pay anyway.

Most people here can complain and say they won't buy it, but Nvidia's revenue increase in their gaming sector shows that the actual market acts the other way around.
Agree! People don't have to buy anything if they don't want to. Nobody is forcing them to buy!

There will always be rich and poor people, but people have the choice to vote with their wallets. And to be fair, if people had never bought GPUs at crazy high prices during the pandemic, we would not be in this situation...

Also, chip manufacturing and component prices keep rising all the time, so those companies won't stop charging more anytime soon.
 
So when are they coming out? I was told I should wait a few months for the 5070 instead of getting a 4070 Ti Super for €843 now. Can I get a 5070 that beats the Ti Super around Xmas for €800?
 
Decent prices will only come in spring.
 
Decent prices will only come in spring.
For the 5070 too, or will they wait half a year to release it like the 4070? If so, I would need to wait at least a year for it. The 4070 Ti Super is almost at its all-time low price, down from €950 to €843 on alza.de.
 
For the 5070 too, or will they wait half a year to release it like the 4070? If so, I would need to wait at least a year for it. The 4070 Ti Super is almost at its all-time low price, down from €950 to €843 on alza.de.
First will be the RTX 5090 and RTX 5080; as for the midrange RTX 5070, it's hard to say when it will come out. You will need to wait at least 8 months from now to get a deal.
 
I hope the die is large; it would improve cooling.
But I'm sure it won't be.
 
First will be the RTX 5090 and RTX 5080; as for the midrange RTX 5070, it's hard to say when it will come out. You will need to wait at least 8 months from now to get a deal.
I ain't got time to wait almost a year when my new 2K 200 Hz monitor is on the way.
 
I ain't got time to wait almost a year when my new 2K 200 Hz monitor is on the way.
Look at a used RTX 4070 Ti or RX 7900 XT. It's rare to find one, but maybe you'll get lucky. Buying a new GPU at the end of its life, right before new ones come out, is not a great idea.
 
Look at a used RTX 4070 Ti or RX 7900 XT. It's rare to find one, but maybe you'll get lucky. Buying a new GPU at the end of its life, right before new ones come out, is not a great idea.
You won't get a good deal on that 12 GB joke of a card; plus, there's the 4070 Super, which does the same new.
So when are they coming out? I was told I should wait a few months for the 5070 instead of getting a 4070 Ti Super for €843 now. Can I get a 5070 that beats the Ti Super around Xmas for €800?
Are you sure the 5070 will be 12 GB? If you must buy now, I'd still suggest getting a 4080 Super for €999 and enjoying the full 64 MB L2 cache instead.
Although the 5080 might cost the same and have all the new DLSS/FG and RT goodies, which would work better, plus a revised, more efficient thread engine.
 
Well,
I ain't got time to wait almost a year when my new 2K 200 Hz monitor is on the way.

The rumor is the 5090 and 5080 will at least be officially announced at CES in early January. They may be released then too, but no one knows right now. My guess is the 5070 and 5060 will come a few months after that, but that's just another guess.

There is also the possibility that the Blackwells will not be in enough supply to meet demand, which leads to retailer price gouging, as always. My plan is to wait about a year from now to pick up a 5090 after the dust settles, so it was worth it to me to pick up a 4070 Super to hold me over until then. My 2070 Super, like yours, wasn't really cutting it at 1440p anymore, and I made matters worse by deciding to give 4K a try (which is great, btw). All I have to do is put off playing any GPU-intensive games for about a year.

If you don't want to wait on the 5070, you don't have to. You can pick up a 4070 Super and sell it later, but as you know, that decision will cost you a few hundred dollars.
 
That would be the most likely design, unless they somehow managed to take advantage of the new active silicon cooling, though it's doubtful they would take such a gamble with a halo flagship.

Interesting video, but it's innovative cooling for medium-power mobile devices, not for power-hungry desktop GPUs. 5 W of heat removed per 1 W of supplied power is not an impressive ratio. Scaled up to a 400 W GPU, this kind of cooler alone would need 80 W of supply power.
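As a back-of-the-envelope check on that scaling (a short sketch of my own, using the 5:1 heat-moved-to-input-power ratio quoted above):

```python
# Heat moved per watt of cooler input power, as quoted above (assumed ratio)
COP = 5.0

def cooler_input_power(gpu_heat_watts: float, cop: float = COP) -> float:
    """Electrical power the active cooler itself would draw to move the GPU's heat."""
    return gpu_heat_watts / cop

print(cooler_input_power(400))  # 80.0 -> 80 W just to drive the cooling
```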
 
Well,


The rumor is the 5090 and 5080 will at least be officially announced at CES in early January. They may be released then too, but no one knows right now. My guess is the 5070 and 5060 will come a few months after that, but that's just another guess.

There is also the possibility that the Blackwells will not be in enough supply to meet demand, which leads to retailer price gouging, as always. My plan is to wait about a year from now to pick up a 5090 after the dust settles, so it was worth it to me to pick up a 4070 Super to hold me over until then. My 2070 Super, like yours, wasn't really cutting it at 1440p anymore, and I made matters worse by deciding to give 4K a try (which is great, btw). All I have to do is put off playing any GPU-intensive games for about a year.

If you don't want to wait on the 5070, you don't have to. You can pick up a 4070 Super and sell it later, but as you know, that decision will cost you a few hundred dollars.
If it's like Lovelace, then the 5090 is not going to drop in price; more like the opposite... Also, the 5090 might sell for $2,000, so I hope you're saving already.
 
Probably going to sit this generation out unless they happen to be really great cards. Even then, I'd probably only buy one 5090 or 5080 and waterfall-upgrade all my rigs. All the PCs in my house have enough GPU power for now, except my 4K setups.
 
Nobody but the most avid overclockers will buy the 5090, not because it will almost certainly cost >= $2,500, but because it will break the weight, size, and power draw limits of the majority of systems. A 600 W GPU is something meant for data centers, not home computers.

Good lord. Stop with the hysteria. There will be no huge uprising against Nvidia. The days when PC gaming is only affordable to the Warren Buffetts of the world are not coming either. PC gaming has gotten more expensive, even factoring in inflation, but it's not time for panic. We've weathered several mining crazes and a pandemic that bred scalpers causing absurd prices, and we will survive the AI craze as well. Nvidia isn't going to price themselves out of the gaming market. Huang is not a damn fool. We are talking billions and billions of dollars for them. Yes, the AI market is many times bigger, but it's not the entire market. AMD will still be making entry-level and midrange GPUs, and there's even a possibility that Intel will survive in the dGPU market. Software will continue to improve. You don't have to buy a 5080 or 5090 to have a great experience with PC gaming, unless you are one of the 4% gaming at 4K; the other 96% will be fine.

Hell, even the argument that dGPU prices are driving the masses to consoles is questionable. From what I've heard, the PS5 Pro is $700, and the next-gen consoles may cost even more. Gaming is getting more expensive.
But Nvidia's 5080- and 5090-class GPUs enjoy a monopoly segment, so they can price them 50% above the previous gen and still not be out of the market. They obviously know the price elasticity of these products, and knowing Jensen's leadership style as well as we all do, he will absolutely price these two at the very limit of that curve. $2,500 for the 5090 and $1,500 for the 5080 sound very "reasonable" to me in a monopolistic scenario.
 
Nobody but the most avid overclockers will buy the 5090, not because it will almost certainly cost >= $2,500, but because it will break the weight, size, and power draw limits of the majority of systems. A 600 W GPU is something meant for data centers, not home computers.
With 32 GB of GDDR7 it's going to be a beast for machine learning, while being way cheaper than the enterprise offerings.
Just power limit it to a reasonable value (~300 W or so) and you're golden. Rumors still point to it being a dual-slot GPU, idk how, but it'd be nice nonetheless.

Reminder that the 4090 also supports 600 W configs, but most people don't run it like that. Saying that the 5090 will be a 600 W GPU doesn't mean that it'll draw 600 W all the time.
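For what it's worth, a minimal sketch of the "just power limit it" idea via the NVML Python bindings (nvidia-ml-py); the 300 W target and the clamping are my own illustration, applying the limit needs root, and `nvidia-smi -pl 300` does the same from the shell:

```python
# Sketch: cap the GPU's power limit through NVML (pip install nvidia-ml-py).
# Reading the constraints works unprivileged; setting the limit needs root/admin.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Driver-allowed minimum and maximum limits, in milliwatts
lo_mw, hi_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)

# Clamp the desired 300 W into the legal range before applying it
target_mw = max(lo_mw, min(300 * 1000, hi_mw))
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

print(f"Power limit set to {target_mw / 1000:.0f} W")
pynvml.nvmlShutdown()
```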
 
It should be a good card for demanding RT games or modeling, but with a very narrow window for machine learning. For the latter, we will soon see much more energy-efficient alternatives for deep neural networks, or even radically different alternatives like Kolmogorov-Arnold Networks.
But for a year or two it would be the best available off-the-shelf solution, depending on scalper demand.

edit
@igormp
edit2
In a 5-6 year timeframe, all soldered-memory GPUs will be entry-level class (focused on games) or obsolete, imho.
Large, modular banks of fiber-connected memory will come to the mainstream at the edge for machine learning tasks; that's my prediction.
So a 5090 with 32 or 48 GB doesn't excite me.
 
For the latter, we will soon see much more energy-efficient alternatives for deep neural networks
Doubt it. There are many other alternatives in place, but the availability and software stack for those are a pain.
Even ROCm is still a pain to use on AMD GPUs.
Nvidia will likely still reign over that area for the next 5 years without issues.
Kolmogorov-Arnold Networks
While the idea is nice, KANs are mathematically equivalent to your good old MLPs, and can also be run on GPUs.
I get your idea with "radically different alternatives", but your example is not one of those haha
 
While the idea is nice, KANs are mathematically equivalent to your good old MLPs, and can also be run on GPUs.
I get your idea with "radically different alternatives", but your example is not one of those haha
Take my "radically different" as: look-up tables are radically different hardware blocks from matrix multipliers.
I don't have outer-space solutions in mind. :p
edit
https://semiengineering.com/hardware-acceleration-approach-for-kan-via-algorithm-hardware-co-design/
^ link to KAN implementation article ^
Nvidia will likely still reign over that area for the next 5 years without issues.
You are definitely overestimating the AI bubble timeframe, imho.
Yes, they could reign, but only with quite new, much more energy-efficient architectures, or they will die like the dinosaurs 65 million years ago.
 
Last edited:
Take my "radically different" as: look-up tables are radically different hardware blocks from matrix multipliers.
I don't have outer-space solutions in mind. :p
edit
https://semiengineering.com/hardware-acceleration-approach-for-kan-via-algorithm-hardware-co-design/
^ link to KAN implementation article ^
FPGAs have been used for many different designs already (so, basically, your idea of LUTs), but they are not the most efficient parts for most current needs.

As I said before, KANs and MLPs are mathematically equivalent, so you can also implement KANs in ways that are a better fit for GPUs (i.e., as matmuls), while still maintaining the same capabilities as the original idea:
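To make that concrete, here's a minimal sketch of my own (not from any paper linked in this thread), with a fixed polynomial basis standing in for learnable B-splines: once every input is expanded through a shared basis, the per-edge univariate functions of a KAN layer collapse into one ordinary matmul, exactly the operation GPUs are built for.

```python
# Sketch: a KAN-style layer as a single matrix multiply.
# Each edge function phi_ij(x_j) = sum_k w_ijk * basis_k(x_j); expanding the
# inputs through the shared basis first turns the whole layer into basis @ W.
import numpy as np

def kan_layer(x, W, degree=3):
    """x: (batch, d_in); W: (d_in * (degree + 1), d_out)."""
    # Shared basis: powers x^0 .. x^degree of every input feature
    basis = np.concatenate([x ** k for k in range(degree + 1)], axis=1)
    return basis @ W  # (batch, d_out), same shape an MLP layer produces

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))             # batch of 4, 8 input features
W = rng.standard_normal((8 * 4, 16)) * 0.1  # degree-3 basis, 16 outputs
print(kan_layer(x, W).shape)                # (4, 16)
```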

You are definitely overestimating the AI bubble timeframe, imho.
Yes, they could reign, but only with quite new, much more energy-efficient architectures, or they will die like the dinosaurs 65 million years ago.
I could say the same about your ideas on decoupled memory, but I believe neither of us has a crystal ball, right?
 