
Is Nvidia going to release a full die Ada GPU?

According to the TPU GPU DB, a full-die Ada GPU would have the following specs (Shaders / TMUs / ROPs): 18432 / 576 / 192, yet it doesn't seem Nvidia is going to release a GPU with these specs, because the Titan Ada has been cancelled and none of the AI/HPC products are full-die Ada parts (the RTX 6000 Ada Generation comes closest). Is the difference in performance between an RTX 6000 Ada Generation and a full-die Ada part too small to be worth it? Even in the professional HPC/AI product segment?
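For rough context, here's the shader-count math behind that question (counts from the TPU GPU DB; a quick back-of-the-envelope sketch that assumes throughput scales at best linearly with shader count):

```python
# Back-of-the-envelope: how much raw shader hardware is left on the table?
# Counts from the TPU GPU DB; real-world scaling will be below linear.
full_ad102 = 144 * 128      # 144 SMs x 128 FP32 shaders per Ada SM = 18432
rtx_6000_ada = 142 * 128    # 142 SMs enabled = 18176
rtx_4090 = 128 * 128        # 128 SMs enabled = 16384

for name, shaders in [("RTX 6000 Ada", rtx_6000_ada), ("RTX 4090", rtx_4090)]:
    deficit = (full_ad102 - shaders) / shaders * 100
    print(f"{name}: {shaders} shaders, full die has {deficit:.1f}% more")

# RTX 6000 Ada: 18176 shaders, full die has 1.4% more
# RTX 4090: 16384 shaders, full die has 12.5% more
```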
 
According to the TPU GPU DB, a full-die Ada GPU would have the following specs (Shaders / TMUs / ROPs): 18432 / 576 / 192, yet it doesn't seem Nvidia is going to release a GPU with these specs, because the Titan Ada has been cancelled and none of the AI/HPC products are full-die Ada parts (the RTX 6000 Ada Generation comes closest). Is the difference in performance between an RTX 6000 Ada Generation and a full-die Ada part too small to be worth it? Even in the professional HPC/AI product segment?
You missed the hubbub about the 4090 Ti, didn't you?
 
It was canceled so that they could focus on AI.

The L40 is probably the closest thing we will see, and probably what the 4090 Ti was going to be.

 
Is the difference in performance between an RTX 6000 Ada Generation and a full-die Ada part too small to be worth it? Even in the professional HPC/AI product segment?
Well, Nvidia seems to have answered that by canceling the 4090 Ti. After all, they know exactly how many 3090 Ti cards were sold.

Nvidia has allocated their best silicon to their datacenter customers for years. Those people truly value performance-per-watt. PC hardware enthusiasts rarely do, even here at TPU.
 
Even if I could, dropping 3K on a GPU is dumb.

C'mon Intel, shake up the market and give these guys a good smack for us :)
 
Even if I could, dropping 3K on a GPU is dumb.

C'mon Intel, shake up the market and give these guys a good smack for us :)
Gelsinger isn't so stupid.

If he thinks Intel's GPU silicon can make more money in the hands of datacenter customers, guess what he'll do? He's an old-school Intel engineer who ended up as EMC COO and then VMware CEO. He's very enterprise-focused.
 
No one is coming to save us :(
At least in 2023, there are no knights in shining armor on white horses to save the consumer GPU damsel in distress. They are all off on the Crusades fighting each other in the War of the Datacenters.

All these companies will keep a few premium PC graphics cards around. After all, all video games are designed on PCs. But PC gaming (and its hardware) will end up like 35mm film photography. A niche hobby for those with deep wallets.

Cloud gaming -- much derided by people here and elsewhere -- continues to make inroads. You can say all the negative things you want about cloud gaming but I guarantee you that it is not standing still. It might not be the best solution for everyone all the time, but for sure it will be a great choice for many for certain situations (and those situations grow all the time).

Think about Baldur's Gate 3. It's a turn-based strategy RPG that really doesn't rely on super-low input latency. Or Microsoft Flight Simulator. With a cloud gaming service like GeForce NOW, the user doesn't need to worry about installing the latest driver or that 20GB patch. That's today. This stuff continues to move forward.

The PC gaming hardware business is very limited and difficult to break into. And the people who can design this silicon know where the money is. Even if a bunch of Nvidia/AMD/Intel engineers left to form a startup, they'd focus on the explosively growing AI accelerator market instead of the largely stagnant consumer PC market.

Remember that AI didn't appear out of nowhere. These companies have been working on differentiated silicon for machine learning for years, even Apple, which doesn't sell its silicon. The AI frenzy was bound to happen at some point. Even if Apple isn't competing in silicon sales, they are competing for talent: every engineer who signs up to work on Apple's Neural Engine is a person who is not collecting a paycheck from AMD, Intel, Nvidia, Amazon, Alphabet, Meta, etc.
 
Well, Nvidia seems to have answered that by canceling the 4090 Ti. After all, they know exactly how many 3090 Ti cards were sold.

Nvidia has allocated their best silicon to their datacenter customers for years. Those people truly value performance-per-watt. PC hardware enthusiasts rarely do, even here at TPU.
This makes sense, but then why no full-size Ada die for DC customers this gen? Hasn't Nvidia released full-size dies in the past, at least for the HPC/professional segment?
 
This makes sense, but then why no full-size Ada die for DC customers this gen? Hasn't Nvidia released full-size dies in the past, at least for the HPC/professional segment?
The L40 is basically that.
 
But it's not, R-T-B; it's close, but it's not a full-size Ada die. Ampere, Pascal and Turing all had full-size dies released with nothing cut. With Pascal, the Titan X Pascal was pretty close to being a full die but it wasn't, and they still came out with the Titan Xp (which was).
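For reference, here's the Pascal situation in numbers (GP102 counts from the TPU GPU DB; just a quick sketch):

```python
# GP102 (Pascal) shader counts, to show what "full die" meant back then.
gp102_full = 30 * 128       # 30 SMs x 128 shaders per Pascal SM = 3840
titan_x_pascal = 28 * 128   # 28 SMs enabled = 3584 (cut down)
titan_xp = 30 * 128         # all 30 SMs enabled = 3840 (the true full die)

print(f"Titan X Pascal leaves {(gp102_full - titan_x_pascal) / titan_x_pascal:.1%} on the table")
print(f"Titan Xp leaves {(gp102_full - titan_xp) / titan_xp:.1%} on the table")

# Titan X Pascal leaves 7.1% on the table
# Titan Xp leaves 0.0% on the table
```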
 
Obviously the gaming world needs a full die Ada -- Cyberpunk 2077 w/path tracing enabled practically requires it!
 
Obviously the gaming world needs a full die Ada -- Cyberpunk 2077 w/path tracing enabled practically requires it!

I mean, would 5 more FPS really make that much of a difference? And that's assuming 20% scaling; more likely it would be 15% at best. Full die or not, you'd still need DLSS and frame generation, and the likely cost would be anywhere from $400-900 USD more. It really wouldn't make much sense for 99.9% of buyers. Kinda like the Matrix card that just came out.
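To put rough numbers on that (a sketch only; the 25 FPS baseline is an assumed native path-tracing figure, not a benchmark):

```python
# Rough sketch of what a 15-20% uplift means in absolute FPS.
# base_fps is an assumed native path-tracing baseline, not a measured number.
base_fps = 25.0

for scaling in (0.15, 0.20):
    uplift = base_fps * scaling
    print(f"{scaling:.0%} scaling: +{uplift:.1f} FPS ({base_fps + uplift:.1f} total)")

# 15% scaling: +3.8 FPS (28.8 total)
# 20% scaling: +5.0 FPS (30.0 total)
```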
 
But it's not, R-T-B; it's close, but it's not a full-size Ada die. Ampere, Pascal and Turing all had full-size dies released with nothing cut. With Pascal, the Titan X Pascal was pretty close to being a full die but it wasn't, and they still came out with the Titan Xp (which was).
I was under the impression it was. Guess I'm mistaken.

EDIT: Was thinking of this; 99% sure this is full die:


EDIT EDIT: It is, but it's Ampere. Jeez, I am all over the place tonight, ignore me lol.
 
I mean, would 5 more FPS really make that much of a difference? And that's assuming 20% scaling; more likely it would be 15% at best. Full die or not, you'd still need DLSS and frame generation, and the likely cost would be anywhere from $400-900 USD more. It really wouldn't make much sense for 99.9% of buyers. Kinda like the Matrix card that just came out.
It could've been doable for less -- if SLI and Crossfire were still operational tech.
 
It could've been doable for less -- if SLI and Crossfire were still operational tech.

I mean, is $4000 USD worth it just to play path-traced Cyberpunk when the developer has already stated it looks better with DLSS anyway? It just doesn't make a lot of sense. Most games had poor scaling with SLI, 40-60% at best, so the return on investment was already poor when two high-end GPUs cost $500-700 each, let alone at least $1600-2000. On top of that, frame times were not very good.
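As a rough cost-per-frame sketch (the prices and the 60 FPS single-card baseline are assumed round numbers, just to show how the scaling math plays out):

```python
# Rough sketch of why SLI return-on-investment looked poor.
# Price and FPS figures are assumed round numbers, not benchmarks.
single_price, single_fps = 1600, 60.0   # one flagship card

for scaling in (0.40, 0.60):            # typical best-case SLI scaling range
    dual_price = single_price * 2
    dual_fps = single_fps * (1 + scaling)
    print(f"{scaling:.0%} scaling: ${single_price / single_fps:.1f}/FPS single "
          f"vs ${dual_price / dual_fps:.1f}/FPS dual")

# 40% scaling: $26.7/FPS single vs $38.1/FPS dual
# 60% scaling: $26.7/FPS single vs $33.3/FPS dual
```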
 
I mean, is $4000 USD worth it just to play path-traced Cyberpunk when the developer has already stated it looks better with DLSS anyway? It just doesn't make a lot of sense. Most games had poor scaling with SLI, 40-60% at best, so the return on investment was already poor when two high-end GPUs cost $500-700 each, let alone at least $1600-2000. On top of that, frame times were not very good.
There's an old line from drag racing: speed costs. For people who want to play CP2077 with path tracing on and get playable framerates, SLI would've been the ONLY option. Ironically enough, Crysis supported SLI.

Then there's this:

A 90% increase in FPS in Vulkan in Strange Brigade.

Personally, I think it was more a business decision than an engineering decision to dump SLI/Crossfire. Who knows where they could've gone if they had continued to develop the tech w/faster buses and faster interconnects?
 
Hard to say. My last SLI setup was 1080 Ti, and it wasn't very good by that point in most games. From the 20 series on it didn't make a ton of sense due to the flagship cards jumping in cost drastically.

Until Nvidia or AMD figure out how to use dual GPU dies where the system sees the GPU as one large die, it'll stay dead.
 
I just read an old W1zzard article testing a plethora of games in SLI; 1080 Ti in SLI really did surprisingly well in quite a few of them. I don't think his test system supported two PCIe 3.0 x16 slots either.

Here's the article if you're interested:
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-sli-rtx-2080-sli-nvlink/5.html

Are you talking about the GTX 1080 SLI article he did? I don't think there was any 1080 Ti SLI testing. 1080s came out nearly a year prior, and support at that time was generally better, but the biggest issue with SLI was frame times, which a lot of reviews ignored back then.

I don't remember TPU ever doing one for the 1080 Ti.
 
It seems very unlikely at this point. Unless Ada has an unusually protracted lifespan, the Super/Ti variants should have launched already. But they didn't, and there are no signs that they will be released.

The Titan Ada was supposedly timed to mark Nvidia's 30th anniversary earlier this year, but they seem to have cancelled it entirely, if it was even real at all. So the closest you'll get to a full-die Ada is the RTX 6000 Ada Generation, which still only has 142 out of 144 SMs enabled.
 
The only scenario where I can see a full Ada die being released to gamers is the RTX 50 series being delayed into late 2025 and the AI bubble popping; neither is very likely. A refresh with better 4060 Ti-4070 Ti cards wouldn't surprise me, though.
 
The question is, how many people would buy a 4090 Ti when the release of the 5090 is nearing?
 
There is no reason for Nvidia to release a 4090 Ti, because AMD doesn't have a GPU to beat the 4090.
The reason the 3090 Ti exists is that Nvidia wanted to take the top spot. I felt sorry for people who bought that card only to have the 40 series announced a few months later.
The 3090 Ti was hella expensive.
 