Sunday, May 5th 2024

NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025

NVIDIA debuted the current RTX 40-series "Ada" in 2022 and refreshed it earlier this year, which means the company is expected to debut its next generation in some shape or form in 2024. We've known for a while that the new GeForce RTX 50-series "Blackwell" could see a 2024 debut, which, going by past trends, would cover the top two or three SKUs, followed by a ramp-up the following year. We're now learning through a new Moore's Law is Dead leak, however, that the launch could be limited to just the flagship product: the GeForce RTX 5090, the SKU that succeeds the RTX 4090.

Even a launch limited to the flagship RTX 5090 would give us a fair idea of the new "Blackwell" architecture, its various new features, and how the other SKUs in the lineup could perform at their relative price points, because the launch could at least include a technical overview of the architecture. NVIDIA "Blackwell" is expected to introduce another generational performance leap over the current lineup. The reasons NVIDIA is going with a more conservative launch of GeForce "Blackwell" could be to allow the market to digest inventories of the current RTX 40-series, and to accord higher priority to AI GPUs based on the architecture, which fetch the company much higher margins.
Source: Moore's Law is Dead (YouTube)

154 Comments on NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025

#76
john_
AGlezBThat's not my experience so I was wondering where that came from. Do you mind sharing the links?
They follow the flow. When Ryzen came out, they were still promoting low core count Intel CPUs as more future-proof than high core count Ryzen CPUs. When it was pointed out to Steve that higher core count CPUs would be better, he wasn't agreeing. 2-3 years later he was doing comparisons of the R5 1600 with Intel CPUs, saying that he had predicted the 1600 would be more future-proof. When their CPU audience had gone AMD, HUB was more AMD-friendly, and when AMD came out with the RX 5700 and later with the RX 6000 series, they were more balanced in their GPU reviews. That's, I think, the time when Nvidia tried to "convince" them to be more Nvidia-friendly. When Intel came out with hybrid CPUs they started going Intel again, because their audience started looking at Intel CPUs again. When AMD failed with the RX 7000 series, and after a nice trolling campaign against them calling them AMD Unboxed, they started changing and becoming more Nvidia-friendly. Now they will use harsh titles when criticizing AMD, but will be really gentle when talking about Intel or Nvidia. Tim is the biggest advertiser of DLSS on the internet today, at least among channels with a million or more subscribers.
wolfYou know they're fairly neutral when people from both sides claim they're fanboys for the side they don't like. Pretty balanced overall to me, they give crap / praise to either side when it's due. The thing that I don't really enjoy is Steve's lengthy opinion pieces, and how he sets up some tests purely to make whatever point he sets out to make, easy to do when you control the entire testing methodology.
They were never 50-50. They always go with the flow. For the last year at least, they have been pro Intel - Nvidia.
Random_UserIf you consider how big the hype over the 4090 is right now, and the price inflation it has, it's no wonder Nvidia wants to release the top of the crop, the 5090, first.
Nvidia has the opportunity to offer only the 5090 and keep the whole 4000 series on the market for many more months at current prices. That way they can offer the lower models in 2025 and even into the start of 2026, stretching the product cycle of their gaming products to over two years. They can move their server products to more advanced nodes while keeping their gaming products one node behind. That way they can better utilise TSMC's capacity.

JMO always
Posted on Reply
#77
AGlezB
john_They follow the flow. When Ryzen came out, they were still promoting low core count Intel CPUs as more future-proof than high core count Ryzen CPUs. When it was pointed out to Steve that higher core count CPUs would be better, he wasn't agreeing. 2-3 years later he was doing comparisons of the R5 1600 with Intel CPUs, saying that he had predicted the 1600 would be more future-proof. When their CPU audience had gone AMD, HUB was more AMD-friendly, and when AMD came out with the RX 5700 and later with the RX 6000 series, they were more balanced in their GPU reviews. That's, I think, the time when Nvidia tried to "convince" them to be more Nvidia-friendly. When Intel came out with hybrid CPUs they started going Intel again, because their audience started looking at Intel CPUs again. When AMD failed with the RX 7000 series, and after a nice trolling campaign against them calling them AMD Unboxed, they started changing and becoming more Nvidia-friendly. Now they will use harsh titles when criticizing AMD, but will be really gentle when talking about Intel or Nvidia. Tim is the biggest advertiser of DLSS on the internet today, at least among channels with a million or more subscribers.
I understand your point. Not saying I agree, just saying I understand.

They certainly have been mistaken in some of their predictions, but they don't have a crystal ball, so as long as they make it clear those are predictions and nothing more, I don't see the problem. After all, the only way to not be mistaken with predictions is not to make them, and if you watch their Q&As you'll notice they get asked to make them a lot, to the point where "Tim's crystal ball" has become a long-running joke.

Also, they're a Youtube channel and have a very active community. That means they need to respond to what their community is asking about and that will usually be about trending topics because that is how communities and media in general work.

And that last point about DLSS probably means you have good hardware and don't need to use it, which is fine, or you're using Radeon and can't, which is also fine. For the vast majority of gamers (with nVidia cards) DLSS is a good way to boost framerate at no extra cost, and there is nothing wrong with that. If anything, blame game companies for releasing barely optimized games that require DLSS to reach playable framerates on anything but top-shelf hardware.

PS: If you're an AMD fanboy/fangirl/fanwhatever, I understand being bothered by Tim saying FSR is the worst of FSR, XeSS and DLSS. Otherwise I don't see why promoting DLSS for nVidia users would bother you.
Posted on Reply
#78
bonehead123
And so it begins....

Looks like the milk mastas and hype-train conductors will still have their jobs a little longer, hehehe :)
Posted on Reply
#79
trsttte
Given how they've been pricing things, it's the only card that makes sense anyway; the value of the 4080 was terrible at launch. I just hope they don't moonshot its price even more.
Posted on Reply
#80
umeng2002
Looks like AMD might have a winner cooking after all if nVidia™ is rushing out just the 5090.
Posted on Reply
#81
Tech Ninja
Meanwhile, AMD won't have anything faster than the XTX (which is slower than a 3070 in PT) till 2026!
Posted on Reply
#82
x4it3n
I really hope that AMD's RDNA 5 will be competitive with Blackwell in 2025, because if it is, we might see prices go down and Nvidia will probably be forced to release a 5090 Ti with a full GB202 die (or maybe with just 2 SMs disabled).
Unfortunately, Nvidia are probably going to charge $2000 for the vanilla 5090, but if RDNA 5 is competitive then the 5090 will have to drop in price and the 5090 Ti will replace it at $2000. I would love for the 5090/Ti to stay around $1600 and not increase to $2000, but Nvidia are too greedy nowadays :'(
Posted on Reply
#83
k0vasz
Those who want the absolute highest performance of a GPU will buy a 5090, no matter what. At this point, I don't even care about the (real world) price, as it's so ridiculous.

I'm much more interested in the lower end: the 5060-5070, Battlemage, and the equivalent AMD card. I'm expecting real fun in that segment.
Posted on Reply
#84
HOkay
Ah back to the fun game of when to sell the top current gen GPU for an optimum upgrade to the top next gen GPU. I dislike how Nvidia go about things, but you can't deny they make the best top end GPUs. I feel like I need one of the 4k240 OLED monitors before I actually need the upgrade, buuut that feels like a matter of time. Yes, I know I have a problem!

My guess is a whole new level of pricing bravado, I'm guessing $2,499 if it's 50% faster raster than a 4090, $2,099 if it's 30% faster raster.
Posted on Reply
#85
starfals
If it's priced anything like the current 2000-2500 bucks 4090... yeah, no thanks. You can keep it! You might as well just keep the 6090 as well. I'm sure inflation and greed will be an even bigger factor with that card. If the 5080 ain't priced well, I'm out of the next generation too. The way things are going, I'm upgrading CPUs more often than GPUs lol, and it's never been the case before. Thanks Ngreedia!
Posted on Reply
#86
Dimitriman
So just the $2500 5090 this year. How promising! =/
Posted on Reply
#87
harm9963
Hard to believe; it feels like it was yesterday when I got my 4090. If the 5090 is earth-shattering, I might get it.
Posted on Reply
#88
wolf
Performance Enthusiast
john_They where never 50-50. They always go with the flow. The last year at least, they are pro Intel - Nvidia.
Hard disagree that they're pro anyone; they just accurately cover product and feature pros and cons as those products and features release or evolve. So recently they certainly throw constant praise at the 7800X3D, but called out the Ryzen 5700 for being misleadingly named and poorly performing. They throw limited (and from my perspective, begrudging) praise at DLSS, because the testing bears out its superiority, yet rightly call out the RTX 3050 6GB as being misleadingly named and poorly performing.

Over the many years of following them and watching / reading (on techspot) their content, I can't accept or agree that they are pro any company overarchingly, but (mostly) Steve's personal style is such that you could wear whatever tint glasses you want when watching and glean from it that they are anti that color tint. Poor Steve Walton allows himself to get a bit too salty and baited by the fanboys at times, as evidenced by his "I told you so" commentary directly addressing them, and my aforementioned example of tests purposely constructed by him, to make his point du jour.

Personally, I consume mostly just the data that they present, as I have no reason to believe any of it is inaccurate, but pay close attention to the testing methodology, and then make up my own mind on what picture that paints.
Posted on Reply
#89
kapone32
Nothing new here; the only thing is that there is no 80-class card for those who want to save money. Nvidia is so full of hubris that I expect this card will be $2499 US and have DLSS 4.0 with AI enhancements, and reviewers will make the 4090 seem irrelevant, just like they did with the 3090. Of course the bias will be there: if AMD releases a card where the 800 series is faster than the 4090, the narrative will praise DLSS 4.0 as a reason to get it. You will get 40 FPS in Path Tracing in CP2077, as that will be the game they optimize the most for the next card.

There will be those who quote the narrative to justify this, and the truth will be ignored again. What is the truth? Unless you are running a 4K 240 Hz monitor, regular raster is still the foundation of PC gaming. Someone mentioned that now that Everspace 2 supports DLSS, it runs better. The truth is that on my system at 4K I am consistently in the 150-160 FPS range, and that game is not new; I had already finished it last year when it released. As my monitor is 144 Hz, I don't need to spend the cost of the PC to buy a GPU that comes with Nvidia features.

Where I thank Nvidia is in making AMD really try. As a result, I no longer use MSI Afterburner for monitoring, as AMD's software has more settings and can even show you micro stutter in real time.
wolfHard disagree that they're pro anyone; they just accurately cover product and feature pros and cons as those products and features release or evolve. So recently they certainly throw constant praise at the 7800X3D, but called out the Ryzen 5700 for being misleadingly named and poorly performing. They throw limited (and from my perspective, begrudging) praise at DLSS, because the testing bears out its superiority, yet rightly call out the RTX 3050 6GB as being misleadingly named and poorly performing.

Over the many years of following them and watching / reading (on techspot) their content, I can't accept or agree that they are pro any company overarchingly, but (mostly) Steve's personal style is such that you could wear whatever tint glasses you want when watching and glean from it that they are anti that color tint. Poor Steve Walton allows himself to get a bit too salty and baited by the fanboys at times, as evidenced by his "I told you so" commentary directly addressing them, and my aforementioned example of tests purposely constructed by him, to make his point du jour.

Personally, I consume mostly just the data that they present, as I have no reason to believe any of it is inaccurate, but pay close attention to the testing methodology, and then make up my own mind on what picture that paints.
The thing that changed my perspective on HUB was the debacle of the 3090 launch. When HUB originally reviewed the 3090 they listed Frame Gen as a major con. They also mentioned that the 6900XT was 15% slower for half the money. He then went on vacation, and during that Nvidia had his channel compromised. He then came back and bashed Nvidia, but when the AIB 3090s started coming out, Frame Gen became a reason to get them. That translated across the entire community, though, with the exception of a few like Wendell at Level 1.
Posted on Reply
#90
wolf
Performance Enthusiast
kapone32The thing that changed my perspective on HUB was the debacle of the 3090 launch. When HUB originally reviewed the 3090 they listed Frame Gen as a major con. They also mentioned that the 6900XT was 15% slower for half the money. He then went on vacation, and during that Nvidia had his channel compromised. He then came back and bashed Nvidia, but when the AIB 3090s started coming out, Frame Gen became a reason to get them. That translated across the entire community, though, with the exception of a few like Wendell at Level 1.
I'm a bit lost... Frame Gen launched with the 4090 not the 3090 and the timing of what you're saying is all around 30 series / RDNA 2?
Posted on Reply
#91
kapone32
wolfI'm a bit lost... Frame Gen launched with the 4090 not the 3090 and the timing of what you're saying is all around 30 series / RDNA 2?
I guess it was DLSS then. I was not 100% sure but I know it was one of Nvidia's siloed features.
Posted on Reply
#92
wolf
Performance Enthusiast
kapone32I guess it was DLSS then. I was not 100% sure but I know it was one of Nvidia's siloed features.
Makes more sense. The 30 series launched not long after DLSS 2, which was hot on the heels of the straight garbage DLSS 1 and took some time to be included in games, prove itself, and have the user base sing its praises.
Posted on Reply
#93
GodisanAtheist
Thinking the 5090 is going to be at least 2 cuts below the full GB202.

We'll get 30% more performance from a deep cut GB202 for consumers, a shallow cut professional card, and then the full die for the enterprise AI market.

Like AD102 the gaming market is never going to see the full card. Even if AMD was somehow competitive there is just too much damn money to be made in the professional market.

Actually how funny would it be if AMD finally got a solid win on NV with RDNA5 and NV is like "whatever" and makes a trillion dollars selling everyone AI.
Posted on Reply
#94
Lycanwolfen
Bet ya the price will be $2,999.00. Or 3 grand.

5 grand in Canada.

Ya, no thanks.
Posted on Reply
#95
Super Firm Tofu
GodisanAtheistActually how funny would it be if AMD finally got a solid win on NV with RDNA5 and NV is like "whatever" and makes a trillion dollars selling everyone AI.
I'd be the first in line for RDNA5 (or 4) if it had some great feature (hardware or software) that was a must-have. Unfortunately, for whatever reason, AMD is content with good enough in raster, for a tiny percentage less money, and giving everybody 'we have Nvidia at home' style copies of Nvidia-first tech.

For the 5090, my guess is 25-30% more performance for $1599 again.
Posted on Reply
#97
wolf
Performance Enthusiast
Minus InfinityYou've been in a coma I see: FSR 3.1 was announced 6 weeks ago.

www.anandtech.com/show/21317/amd-announces-fsr-31-seriously-improved-upscaling-quality
And it has already been tested by Hardware Unboxed, who usually loathe FSR, and even they were impressed with the improvements
Did you watch the video? 3.1 is announced but not available to consumers in any games yet, and HUB didn't test it at all; they just covered the announcement content provided by AMD and showed older FSR footage.

When it actually gets independently tested and reviewed by the likes of HUB or DF, and users can test it for themselves, we'll get a true sense of the improvements.
Posted on Reply
#98
Neo_Morpheus
AGlezBThat's not my experience so I was wondering where that came from. Do you mind sharing the links?
In my particular observation, they only criticize Ngreedia on pricing, and not all the time. Besides that, they are simply perfect.

But then they proceed to trash AMD on anything that's related to their GPUs. No matter what, none of their AMD videos have a positive thumbnail.

And Tim is even worse than Steve. In his eyes, Ngreedia does no wrong. He worships DLSS but will never say anything like “anti consumer tech that limits your options” or anything like that.

I personally stopped watching their videos.
Posted on Reply
#99
64K
There's an article over on videocardz reporting that a reliable leaker (kopite) has said the 5080 should launch first this year. The article is tagged as a rumor, though, so it's still up in the air what will come this year.
Posted on Reply
#100
wolf
Performance Enthusiast
Neo_MorpheusNo matter what, none of their AMD videos have a positive thumbnail.
Simply wrong; it didn't take long to find these at all. And there are more for positives on their CPUs.

Posted on Reply