
Is Nvidia going to release a full die Ada GPU?

No one is coming to save us :(
As long as fools keep buying into features rather than actual hardware, can you blame them?

Yes, this is a stab, get real. We fed this beast. We all did and we still do. Can't wait for the next DLSS/FSR article! Weeee
 
As long as fools keep buying into features rather than actual hardware, can you blame them?

Yes, this is a stab, get real. We fed this beast. We all did and we still do. Can't wait for the next DLSS/FSR article! Weeee
Maybe cause hardware hit a wall? The XTX is almost twice the "hardware" of the 4080 and it barely beats it in raster while losing immensely in RT. How is that "hardware" working out for you? It doesn't.

DLSS is trying to give you 99% of the image quality with 50% of the grunt required. And as is usually the case, only people that buy AMD cards complain about these features. Mostly cause they are half baked on their cards and awful, so they try to paint nvidia's features with the same brush. This needs to stop...
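(For a rough sense of where that "50% of the grunt" figure comes from, here is a back-of-the-envelope sketch in C++. The per-axis render scales used below are the commonly cited DLSS 2 preset values and are an assumption on my part, not something quoted in this thread.)

#include <cstdio>

int main() {
    // Target (output) resolution: 1440p.
    const int outW = 2560, outH = 1440;

    struct Mode { const char* name; double scale; };
    // Per-axis internal render scales commonly cited for the DLSS 2 presets (assumed values).
    const Mode modes[] = {
        {"Quality",           2.0 / 3.0},
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},
    };

    for (const Mode& m : modes) {
        const int w = static_cast<int>(outW * m.scale + 0.5);
        const int h = static_cast<int>(outH * m.scale + 0.5);
        // Shading cost scales roughly with pixel count, so this ratio is a crude
        // proxy for how much "grunt" is spent before the upscale pass.
        const double ratio = static_cast<double>(w) * h / (static_cast<double>(outW) * outH);
        std::printf("%-17s renders %4dx%4d internally (~%.0f%% of native pixels)\n",
                    m.name, w, h, ratio * 100.0);
    }
    return 0;
}

(The fps gain is smaller than the pixel savings suggest - the 25-30% often cited rather than 2x - because the upscale pass itself plus geometry and other fixed per-frame costs don't shrink with render resolution.)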
 
LLMs don't have a key front-end use case that can be monetised yet. Unless they get one, the AI hype will bust harder than Cisco in 2000.

All the data centre business relies on the products using those data centres being sold to businesses or consumers. We're already seeing Microsoft scale back.

If that happens, Nvidia is going to focus on GPUs directly to consumers again. We might not see a 4090 Ti.

What we'll probably see is a return to them selling products at price points to shift card volume and a battle for who gets console and handheld chip business again.
 
Maybe cause hardware hit a wall? The XTX is almost twice the "hardware" of the 4080 and it barely beats it in raster while losing immensely in RT. How is that "hardware" working out for you? It doesn't.

DLSS is trying to give you 99% of the image quality with 50% of the grunt required. And as is usually the case, only people that buy AMD cards complain about these features. Mostly cause they are half baked on their cards and awful, so they try to paint nvidia's features with the same brush. This needs to stop...
Hardware didn't hit a wall at all, there is still a major generational gap. Potentially. It requires vendors to actually equip GPUs properly instead of using lackluster configurations. +30% is fine gen to gen and it's still easily possible, as proven by several recent GPUs including the 7900XTX.

Also you really gotta stop trying to make everything an AMD/Nvidia feud, it's not. AMD pushes FSR just the same. 'This needs to stop' indeed. The complaints are universal, imho. It's also a stab at myself - we ALL do it. GPUs still make these companies a shitload of money and they're getting great margins because we are all still too eager.
 
No they probably won't because they have no reason to.
We might still see a 4080 Ti tho.

AMD won't even compete in the high-end market with RDNA4. Nvidia officially stated that next gen is 2025 and I believe them. AMD is no threat and they have full focus on AI.

Even AMD is starting to chase AI now and is somewhat abandoning the gaming race, the high end at least:


 
Here we go again :(
becoming a nvidia vs AMD shitshow
 
Hardware didn't hit a wall at all, there is still a major generational gap. Potentially. It requires vendors to actually equip GPUs properly instead of using lackluster configurations. +30% is fine gen to gen and it's still easily possible, as proven by several recent GPUs including the 7900XTX.

Also you really gotta stop trying to make everything an AMD/Nvidia feud, it's not. AMD pushes FSR just the same. 'This needs to stop' indeed. The complaints are universal, imho. It's also a stab at myself - we ALL do it. GPUs still make these companies a shitload of money and they're getting great margins because we are all still too eager.
But DLSS gives you that 30% with minimal investment in hardware and also minimal loss of image quality, if even that, since sometimes it just straight up looks better.

Whenever I see people complaining about DLSS or RT, I check their sigs, and lo and behold...
 
The only scenario in which I can see a full Ada die being released to gamers is the RTX 50 series being delayed into late 2025 and the AI bubble popping; neither is very likely. A refresh with better 4060 Ti-4070 Ti cards wouldn't surprise me though.

5000 series will be released in 2025 according to the official Nvidia road map.

They can't refresh 4070 Ti really, it's using the entire AD104 die.

We might see a 4080 Ti with 20GB GDDR6X using 320 bit bus tho. If Nvidia bothers at all, because it will be priced too close to 4090.
 
But DLSS gives you that 30% with minimal investment in hardware and also minimal loss of image quality, if even that, since sometimes it just straight up looks better.

Whenever I see people complaining about DLSS or RT, I check their sigs, and lo and behold...

So why are Nvidia still selling us cards at $1600? If it's all about the software the cards don't require such grunt.

The truth is, nothing in the domestic consumer field is ready for full ray-tracing, let alone path-tracing. But NV have pushed it hard to legitimise excessive prices. I'd much rather see a clear bifurcation of product lines. A more expensive powerhouse, and then a much more sensible cascade of tiers (as it used to be). But Nvidia have jacked it all too high, and they confuse the consumer by insisting on selling people on effects which are (at the moment) arguably icing.

As far as needing RT, or insisting it's necessary to get full enjoyment - I quote from w1zz's review for the recent Cyberpunk 2077 patch:

When focusing on gameplay, the title actually still looks fantastic without ray tracing enabled, which is good news for the masses of gamers that want to enjoy the game but don't have the hardware for ray tracing.

There's a flip side - game devs could do a far better job. I look at Days Gone as a fine example of a great looking game. It doesn't need crazy hardware or AI options to run well. And it's relatively old on PC (4 years).


Folks are just getting used to listening to what NV keep saying, and paying more and more for it.
 
So why are Nvidia still selling us cards at $1600? If it's all about the software the cards don't require such grunt.
Are you suggesting they should sell software for free or what?

The truth is, nothing in the domestic consumer field is ready for full ray-tracing, let alone path-tracing. But NV have pushed it hard to legitimise excessive prices. I'd much rather see a clear bifurcation of product lines. A more expensive powerhouse, and then a much more sensible cascade of tiers (as it used to be). But Nvidia have jacked it all too high, and they confuse the consumer by insisting on selling people on effects which are (at the moment) arguably icing.

As far as needing RT, or insisting it's necessary to get full enjoyment - I quote from w1zz's review for the recent Cyberpunk 2077 patch:



There's a flip side - game devs could do a far better job. I look at Days Gone as a fine example of a great looking game. It doesn't need crazy hardware or AI options to run well. And it's relatively old on PC (4 years).


Folks are just getting used to listening to what NV keep saying, and paying more and more for it.
Of course nothing is ready for full ray tracing. And of course graphics isn't everything that matters, BUT, when you are spending 600 at the minimum (which is basically 4070 / 7800xt territory) you are definitely expecting to see the fireworks. It makes absolutely no sense to spend 600+ on a GPU and not care about graphics. And since we established you care about graphics, nvidia gives you the best for your money. Period. That's not even debatable. So...

I don't know who these folks that listen to nvidia are. I was waiting for cyberpunk to release for years, and I wanted to enjoy it with all the bells and whistles. AMD can't do that even today, 3 years later, so my only option is buying nvidia. When that changes, I'll be buying AMD, but it seems that they are stuck a generation behind so ...
 
Are you suggesting they should sell software for free or what?


Of course nothing is ready for full ray tracing. And of course graphics isn't everything that matters, BUT, when you are spending 600 at the minimum (which is basically 4070 / 7800xt territory) you are definitely expecting to see the fireworks. It makes absolutely no sense to spend 600+ on a GPU and not care about graphics. And since we established you care about graphics, nvidia gives you the best for your money. Period. That's not even debatable. So...

I don't know who these folks that listen to nvidia are. I was waiting for cyberpunk to release for years, and I wanted to enjoy it with all the bells and whistles. AMD can't do that even today, 3 years later, so my only option is buying nvidia. When that changes, I'll be buying AMD, but it seems that they are stuck a generation behind so ...
You imply there are no fireworks on raster or without RTX, and that's just bottom barrel nonsense to suit your agenda/point. In the same way you're stepping past anything that doesn't fit your view on this.

In the same way you're making way more (too much - in my opinion) of upscaling techs when it's clear everything is moving that way. One company has a slight advantage there, we all know this, it's not a point of discussion, and yet, you manage to bring it up every time as if it is one. Any nuance is lost on you after that, you're just mostly working hard at pushing your point that DLSS is king of everything and the path to world peace, an end to hunger and happy GPU land. IMHO you've lost the plot completely. All I see is fools' money and marketing.

The fact is we're paying for software on a hardware purchase now - you admit this too, so I don't think we disagree a whole lot. You just attribute far more value to that software than others do. That's a you-perspective. Good to keep in mind.
 
You imply there are no fireworks on raster or without RTX, and that's just bottom barrel nonsense to suit your agenda/point.
No I did no such thing. BUT, yes - in general - fireworks are better with RTX.

In the same way you're making way more (too much - in my opinion) of upscaling techs when it's clear everything is moving that way. One company has a slight advantage there, we all know this, it's not a point of discussion, and yet, you manage to bring it up every time as if it is one. Any nuance is lost on you after that, you're just mostly working hard at pushing your point that DLSS is king of everything and the path to world peace, an end to hunger and happy GPU land. IMHO you've lost the plot completely. All I see is fools' money and marketing.

The fact is we're paying for software on a hardware purchase now - you admit this too, so I don't think we disagree a whole lot. You just attribute far more value to that software than others do. That's a you-perspective. Good to keep in mind.
And it's also a you perspective that it's a bad thing. That's why we are here, sharing our perspectives.

I can share some screenshots or videos with DLSS on and off, if you can tell the difference then you might have a case. If not then why the hell are you even arguing here? Yes, DLSS is the best thing that ever happened to PC gaming. In my opinion of course. Increased image quality while boosting performance by 25-30%. That's insane.
 
No I did no such thing. BUT, yes - in general - fireworks are better with RTX.


And it's also a you perspective that it's a bad thing. That's why we are here, sharing our perspectives.

I can share some screenshots or videos with DLSS on and off, if you can tell the difference then you might have a case. If not then why the hell are you even arguing here? Yes, DLSS is the best thing that ever happened to PC gaming. In my opinion of course. Increased image quality while boosting performance by 25-30%. That's insane.
I didn't, but I did. :D It's lovely exchanging perspectives with you, isn't it.

Your DLSS drivel is known by now, no need to post it in every topic where you smell a speck of Nvidia or AMD, buddy. You love it. Fantastic, it's insane, /thread.
 
I didn't, but I did. :D It's lovely exchanging perspectives with you, isn't it.

Your DLSS drivel is known by now, no need to post it in every topic where you smell a speck of Nvidia or AMD, buddy. You love it. Fantastic, it's insane, /thread.
If you don't properly read what I'm saying, sure. I'm saying a game can have nice graphics without RT, but adding RT to a game that has good graphics already makes them better.

I don't love it. I'm saying it increases image quality and adds fps all in one. Whoever is complaining about it does so because of an agenda, else they would submit themselves to a blind test that would take like, 3 minutes.
 
Until nvidia or amd figure out how to use dual GPU dies where the system looks at the GPU as one large die, it'll stay dead.
AMD did that with the HD 7970 X2
But since dual GPU isn't in the DX12 API anymore, it's up to the devs to make it happen, and since people don't have the money for dual GPU (cuz of the-one-company-that-we-can't-name) they won't
 
LLMs don't have a key front-end use case that can be monetised yet. Unless they get one, the AI hype will bust harder than Cisco in 2000.

All the data centre business relies on the products using those data centres being sold to businesses or consumers. We're already seeing Microsoft scale back.

If that happens, Nvidia is going to focus on GPUs directly to consumers again. We might not see a 4090 Ti.

What we'll probably see is a return to them selling products at price points to shift card volume and a battle for who gets console and handheld chip business again.
The actual machine-learning wave (which is entirely distinct from the LLM bubble) is unlikely to crest anytime soon. The former is where all the GPUs are going, because ML models can discern useful trends from raw data that even the best human data analysts cannot, and those trends are becoming the new competitive advantage for companies that are aware of them versus those that aren't - and discovering those trends in a world where more and more data is being recorded more and more granularly, is going to require more and more powerful ML accelerators capable of more and more analyses. Which is why I would be entirely unsurprised if, at some point in the not-too-distant future, NVIDIA abandons the graphics rendering market to purely focus on ML acceleration capabilities - there's so much ML hardware that's being shipped in their consumer GPUs, that they literally had to invent technology (DLSS) to make it not be a complete waste of die space.

Regarding the "AI"/LLM side, being unable to monetise unicorns and rainbows hasn't stopped unscrupulous tech bros from successfully selling unicorns and rainbows to idiot venture capitalists before - why should this time be any different?

On a more serious note, there is obviously a large demand surge for LLM-training-acceleration hardware that is going to abate quite abruptly when the LLM bubble pops - but relative to the ML acceleration demand, that bubble is ultimately insignificant. NVIDIA will take a ding to their profits similarly to what happened after the crypto boom went bust, but in the long term it won't affect them majorly.

I didn't, but I did. :D It's lovely exchanging perspectives with you, isn't it.

Your DLSS drivel is known by now, no need to post it in every topic where you smell a speck of Nvidia or AMD, buddy. You love it. Fantastic, it's insane, /thread.
Yet you're the one who first brought up DLSS/FSR in this thread... so could you take your own advice, and please f**king not? I seriously don't care whether you were being sarcastic, I just care that we don't get yet another possibly interesting thread degenerating into green vs red genital size arguments and being locked as a result.
 
Are you suggesting they should sell software for free or what?

Don't see where I even implied that. The point was pretty clear. If software is running the show, the hardware pricing should change.

You shouldn't need to pay £800 minimum for a third tier product (which I did) to play at 1440p with bells and whistles.
 
I would be more interested in a cut-down AD103 die, for example a 4070 Ti 16GB.
 
AMD did that with the HD 7970 X2
But since dual GPU isn't in the DX12 API anymore, it's up to the devs to make it happen, and since people don't have the money for dual GPU (cuz of the-one-company-that-we-can't-name) they won't

There have been multiple dual GPUs from both companies, but they still acted like two separate GPUs working in SLI/CrossFire. I am talking about AMD using its chiplet approach to use two GCDs that are viewed by the OS as one large GPU.
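(A minimal sketch of what "it's up to the devs" looks like under DirectX 12: explicit multi-adapter means the application enumerates the hardware itself and decides how to split work. The code below only lists adapters and their node counts; everything beyond that - cross-adapter heaps, per-node command queues - is application code. Error handling is trimmed and this is purely illustrative, not taken from any engine mentioned here.)

// Illustrative only: enumerate D3D12-capable adapters and report their node counts.
// Link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, adapter.ReleaseAndGetAddressOf()) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // A linked-node adapter (old SLI/CrossFire-style bridge) appears as ONE
        // adapter with NodeCount > 1, and the app must still spread work across
        // nodes explicitly via NodeMask on queues, heaps and command lists.
        // Two physically separate cards instead appear as two adapters in this
        // loop, and sharing a frame between them is entirely the app's problem.
        std::wprintf(L"Adapter %u: %ls, D3D12 nodes: %u\n",
                     i, desc.Description, device->GetNodeCount());
    }
    return 0;
}

(A chiplet design with multiple GCDs presented as one device would show up here as a single adapter with a single node, which is exactly why the OS and games wouldn't need to do anything special for it.)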

5000 series will be released in 2025 according to the official Nvidia road map.

They can't refresh 4070 Ti really, it's using the entire AD104 die.

We might see a 4080 Ti with 20GB GDDR6X using 320 bit bus tho. If Nvidia bothers at all, because it will be priced too close to 4090.

I was talking more about a refresh of existing cards, a la the Super lines, still using Ada - more memory, higher clocks. Even though it wouldn't surprise me, I don't think it will actually happen.
 
Don't see where I even implied that. The point was pretty clear. If software is running the show, the hardware pricing should change.

You shouldn't need to pay £800 minimum for a third tier product (which I did) to play at 1440p with bells and whistles.

Let's say you pay 20 USD for a steak at the supermarket and cook it yourself, vs paying 100 USD for the same steak cooked by a master chef at a fancy restaurant. Surely the hardware (the steak) is the same, yet a high-class restaurant can charge way more for their food.

I can't see how that's different with GPU, want better looking games (or higher FPS, which is also better looking), just gotta pay up.

Other than that even a 2060 can play games at 4K, just use appropriate settings and/or DLSS.
 
I set myself a price limit of £800. I did not want to pay more - not because (at the time) I couldn't, but because I wouldn't. The steak analogy sort of rebounds because I have my own thoughts about people who pay so much for food, just to eat at a fancy restaurant.

I just don't get why folk defend Nvidia pricing. It's self-defeating and elitist. Yes, as a company they can charge what they want, but it seems counter to the entire hobby PC scene to just blindly accept it. Team red aren't any better. Their initial price point for the XTX was silly as an AMD product, in the same way the 4080 at £1200 is stupid, and the XT was worse. The whole stack is warped.
 
I set myself a price limit of £800. I did not want to pay more - not because (at the time) I couldn't, but because I wouldn't. The steak analogy sort of rebounds because I have my own thoughts about people who pay so much for food, just to eat at a fancy restaurant.

I just don't get why folk defend Nvidia pricing. It's self-defeating and elitist. Yes, as a company they can charge what they want, but it seems counter to the entire hobby PC scene to just blindly accept it. Team red aren't any better. Their initial price point for the XTX was silly as an AMD product, in the same way the 4080 at £1200 is stupid, and the XT was worse. The whole stack is warped.

100% agree pricing is shite top to bottom, although flagship cards have always been priced stupid so I can give them a pass. The thing is, I think things are only going to get more expensive going forward, or we will get much smaller bumps gen on gen. The only cards from Nvidia that maintained price parity this generation either came with 50% less VRAM (the 4060) or were a very small gen-on-gen improvement (the 4060 Ti). I guess only time will tell.

GPUs are definitely a luxury item at this point though, and people just have to vote with their wallets on what pricing they are willing to accept. Plenty of people are buying cards in the $800-1200 range that I feel are stupid, though, so it is what it is.
 
So why are Nvidia still selling us cards at $1600?

Because this:

[Path tracing performance chart, 2560x1440]
And the other hundred things or so that their cards support comprehensively and AMD fumbled/doesn't/won't/has no intention to ever, perhaps.

The clear bifurcation was already done with Turing, when they released the GTX 16 series without the tensor and ray tracing cores to offer a more affordable option, but going forward, all hardware will support ray tracing. It's even made its way to phones: the Snapdragon 8 Gen 2 and the Apple A17 both have hardware RT support. Over time, we'll eventually achieve path tracing even on mobile SoCs, although that may still be some time ahead. Nvidia has leverage, but I don't think they have all that much leverage to move even Apple and Qualcomm in their firmly established niches. The truth is, this RT thing has been coming for a very long time, and the only reason we haven't seen it more widely employed, other than hardware performance, is that game development costs are skyrocketing due to high-quality assets: even if RT actually reduces development time and costs (because you can simply ray trace everything that used to be cube-mapped, bump-mapped, etc.), lots of time and money still goes into assets for high-fidelity raster graphics.
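(As a toy illustration of what "ray trace everything" means per pixel - and why dedicated RT hardware matters - here is the primitive operation underneath it all, a ray-sphere intersection test in C++. Real renderers test rays against BVHs of triangles rather than a lone sphere; this is purely a sketch, not anything from the games discussed above.)

#include <cmath>
#include <cstdio>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
};

// Returns the distance along the ray to the nearest hit, or a negative
// value if the ray misses. Solves |o + t*d - c|^2 = r^2 for t.
double raySphere(const Vec3& origin, const Vec3& dir, const Vec3& center, double radius) {
    const Vec3 oc = origin - center;
    const double a = dir.dot(dir);
    const double b = 2.0 * oc.dot(dir);
    const double c = oc.dot(oc) - radius * radius;
    const double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return -1.0;               // no intersection
    return (-b - std::sqrt(disc)) / (2.0 * a); // nearest root
}

int main() {
    // One camera ray pointing straight down -z at a unit sphere 5 units away.
    const double t = raySphere({0, 0, 0}, {0, 0, -1}, {0, 0, -5}, 1.0);
    std::printf("hit at t = %.2f\n", t);       // expect 4.00
    // A path tracer repeats tests like this for every pixel, every bounce,
    // every frame, which is the workload RT cores exist to accelerate.
    return 0;
}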

With the next generation of consoles slated for around 2028 (from what we saw from the Xbox documents from the FTC x Microsoft litigation regarding ABK), it's reasonable to believe that we'll likely be seeing at least the RTX 4090's level of performance on affordable hardware by then. It remains to be seen if the high end PC GPUs will keep up, but I've no doubt that they will. Can't speak for price, of course... but I have a feeling that Nvidia's aware that this $1500 soft target is about where people drew the line, they are willing to pay for it (4090 has a very high market penetration rate, surprisingly), but once you go above this people stop agreeing fast (see: 3090 Ti's pricing and market performance).
 
The question is, how many people would buy a 4090ti when the release of the 5090 is nearing?
I remember reading this exact same argument w/the 3090ti -- so maybe you're right! :laugh:

Just for the record, I wasn't necessarily referring to a consumer full-die Ada part but rather an AI/HPC/professional full-die Ada part. I believe Nvidia has always released a full-die GPU for at least the last 5 generations -- either for the AI/HPC/professional segment or the consumer segment, or both.

Are you talking about the GTX 1080 SLI article he did? I don't think there was any 1080 Ti SLI testing. 1080s came out nearly a year prior, and support at that time was generally better, but the biggest issue with SLI was frame times, which a lot of reviews ignored back then.

I don't remember TPU ever doing one for 1080ti.
If you read the article you would see 1080ti SLI results are included in every game benchmark. I was amazed to see the 1080ti and 1080ti SLI beating out the 2080ti and 2080ti SLI in a few titles (I believe GTA V was one of those titles).

LLMs don't have a key front-end use case that can be monetised yet. Unless they get one, the AI hype will bust harder than Cisco in 2000.

All the data centre business relies on the products using those data centres being sold to businesses or consumers. We're already seeing Microsoft scale back.

If that happens, Nvidia is going to focus on GPUs directly to consumers again. We might not see a 4090 Ti.

What we'll probably see is a return to them selling products at price points to shift card volume and a battle for who gets console and handheld chip business again.
I thought the console market was dominated by AMD because they have the best iGPU and SOC for that market? Intel/Nvidia don't have anything that can compete w/that do they?
 
I thought the console market was dominated by AMD because they have the best iGPU and SOC for that market? Intel/Nvidia don't have anything that can compete w/that do they?
Contrary to popular belief, the console market is dominated by nvidia; it has sold more Tegra chips than the PS5 and Xbox combined. But, as per usual, amd fans are the loudest, especially when spreading misinformation.

Don't see where I even implied that. The point was pretty clear. If software is running the show, the hardware pricing should change.

You shouldn't need to pay £800 minimum for a third tier product (which I did) to play at 1440p with bells and whistles.
That's not a hardware issue though. Even if they gave you a 4090 for $800, there will be games that you can't play at 1440p. See Immortals of Aveum for example. Devs just do less and less optimization the more hardware you have.
 