Sunday, December 24th 2023

NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

NVIDIA's next-generation GeForce RTX 50-series "Blackwell" gaming GPUs are on course to debut toward the end of 2024, with a Moore's Law is Dead report pinning the launch to Q4-2024. This is an easy-to-predict timeline, as every GeForce RTX generation tends to have about two years of market presence, with the RTX 40-series "Ada" having debuted in Q4-2022 (October 2022) and the RTX 30-series "Ampere" in late Q3-2020 (September 2020).
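As a minimal sketch of that cadence arithmetic (the two-year cycle is an assumption extrapolated from the last two launches, not anything NVIDIA has confirmed):

```python
from datetime import date

# A minimal sketch of the two-year cadence argument. The cadence itself is
# an assumption drawn from the last two launches, not an NVIDIA commitment.
launches = {
    "RTX 30 (Ampere)": date(2020, 9, 1),
    "RTX 40 (Ada)": date(2022, 10, 1),
}

last = max(launches.values())
projected = last.replace(year=last.year + 2)      # add the ~24-month cadence
quarter = (projected.month - 1) // 3 + 1
print(f"Projected RTX 50 launch: Q{quarter}-{projected.year}")  # Q4-2024
```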

NVIDIA's roadmap for 2024 sees a Q1 debut of the RTX 40-series SUPER, with three high-end SKUs refreshing the upper half of the RTX 40-series. The MLID report goes on to speculate that the generational performance uplift of "Blackwell" over "Ada" will be smaller than that of "Ada" over "Ampere." With AI HPC GPUs outselling gaming GPUs by 5:1 in terms of revenue, and AMD rumored to be retreating from the enthusiast segment for its next-gen RDNA4, it is easy to see why.
Source: Moore's Law is Dead (YouTube)

126 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

#26
Prima.Vera
I'm just going to say this one time.
There is NO game in the world that justifies paying more than $1,000 for a video card. There is NONE.
I challenge anyone to say otherwise.
Posted on Reply
#27
The Quim Reaper
Whatever date they say, you can add on at least three months before you'll actually be able to buy one without sitting at a keyboard, hammering F5 all day.
Posted on Reply
#28
Dristun
Prima.VeraI'm just going to say this one time.
There is NO game in the world that justifies paying more than $1,000 for a video card. There is NONE.
I challenge anyone to say otherwise.
Alan Wake 2 looked pretty cool though :D And while I was playing it, the thought that refused to leave my head was how cool it would be to play it without DLSS. I just can't unsee the blurs and smears. People say native looks more aliased, and things like tree leaves or power lines flicker, and so on - yeah, so what? With DLSS, to this day it looks like parts of the frame were rendered on a PS3 and then mixed in with the pitch-perfect parts around them!
Posted on Reply
#29
gffermari
Prima.VeraI'm just going to say this one time.
There is NO game in the world that justifies paying more than $1,000 for a video card. There is NONE.
I challenge anyone to say otherwise.
A game? No.
What about gaming, encoding/decoding, processing with CUDA etc.?
Posted on Reply
#30
Firedrops
The only people who care at all about what MLID says are those who, ironically, haven't actually been listening to him since about 2021. And/or those with amnesia.
Posted on Reply
#31
Onyx Turbine
My current setup's 1660 limit hurts (definitely when I wanted to play Starfield, though that game doesn't convince me yet), but not enough to keep me from waiting for Q4 2024 to see what a 5060 or 5060 Ti could do.
If a 5060 performs close to a 4070, and a 5060 Ti perhaps even slightly better at even lower power consumption, I see these cards being able to age well in mid-range PC builds. The cherry on the cake would be the ability to pair one, by Q1 2025 at the latest, with a new bread-and-butter Intel CPU.

Buying a 4060 doesn't convince me that much: as prices drop (given the price gap between the 4060 and 4060 Ti), I calculated that the 4060 Ti becomes the more attractive choice, but probably not when compared against waiting for the 5060/5060 Ti. IMHO NVIDIA succeeded with the 4060 in making a power-efficient card with enough performance for 1080p, but a bit too little for what comes in the future and for the option of 1440p.

If money is not really an issue for a sensible mid-range build, I think a great-value card at the moment would be the 4070 SUPER.

With ever-increasing power prices on their way toward touching €0.50(!) a kWh, a 5060 series with even lower idle power consumption (in YouTube playback as well) is wished for, and don't laugh, but also a TDP of 100 W where every last drop of performance is squeezed out.
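A quick back-of-the-envelope sketch of what such prices mean in practice (hours per day and wattages below are illustrative assumptions, not measurements):

```python
# Rough annual running-cost comparison at the feared EUR 0.50/kWh price level.
# Daily gaming hours and card wattages are illustrative assumptions.
PRICE_EUR_PER_KWH = 0.50
HOURS_PER_DAY = 3          # assumed daily gaming time
DAYS_PER_YEAR = 365

def annual_cost(watts: float) -> float:
    """Yearly electricity cost in EUR for a card drawing `watts` while gaming."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * PRICE_EUR_PER_KWH

for name, watts in [("100 W card", 100), ("200 W card", 200)]:
    print(f"{name}: ~{annual_cost(watts):.0f} EUR/year")
# 100 W card: ~55 EUR/year; 200 W card: ~110 EUR/year
```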
Posted on Reply
#32
Lycanwolfen
I have not been impressed with any of the newer cards. AI and upscaling seem to be the selling points now. I'm still at 2160p, which NVIDIA said with the 1000-series cards was the way to go. I went 4K with twin 1080 Tis in SLI and it still runs great. All the newer cards need software to achieve the same result my pure hardware can. GTX cards were the last of the pure hardware; RTX is ray-tracing software gimmicks. Yes, DLSS is impressive, but it cuts corners. Take off DLSS and all the gimmicks and run 4K, and you see the true power of the 4090. It is not impressive at all.

Also, today's gamers are like, "oooh, the game does not have an SLI option, so I guess it's not supported." Hogwash. Most games in the past never had an option but still supported it. Make the changes in the driver, not the game, duh.
Posted on Reply
#33
R0H1T
Of course it's MLID :laugh:
Posted on Reply
#34
scottslayer
MLID has "leaked" this from Twitter and wherever else like 10 times at this point; give it a rest.
Posted on Reply
#35
gffermari
LycanwolfenI have not been impressed with any of the newer cards. AI and upscaling seem to be the selling points now. I'm still at 2160p, which NVIDIA said with the 1000-series cards was the way to go. I went 4K with twin 1080 Tis in SLI and it still runs great. All the newer cards need software to achieve the same result my pure hardware can. GTX cards were the last of the pure hardware; RTX is ray-tracing software gimmicks. Yes, DLSS is impressive, but it cuts corners. Take off DLSS and all the gimmicks and run 4K, and you see the true power of the 4090. It is not impressive at all.

Also, today's gamers are like, "oooh, the game does not have an SLI option, so I guess it's not supported." Hogwash. Most games in the past never had an option but still supported it. Make the changes in the driver, not the game, duh.
If you don't cut corners, DLSS+FG+...+...+..+.., you'll get 4090 native performance in 25 years.
Posted on Reply
#36
Nordic
cosminmcmTo me it's the other way around, it's like they never come. If you upgrade every 2 generations, you have a looong time to wait.
I used to upgrade every 2 generations, but the best was never good enough for me then. Now I upgrade from old mid-range card to new mid-range card when I need the performance.
Posted on Reply
#37
Lycanwolfen
gffermariIf you don't cut corners, DLSS+FG+...+...+..+.., you'll get 4090 native performance in 25 years.
Or just go back in time, to when cards had the power without the wattage.
Posted on Reply
#38
Vayra86
OnasiIt is still the future in the absolute sense; at some point rasterization will simply run out of potential improvements and rendering cheats that can be applied. However, we aren't talking 2025 or even 2030. People who fell for NVIDIA's hype and genuinely thought that GPUs capable of full RTRT rendering, with the level of fidelity expected from modern high-budget titles, were just around the corner were being silly. We are going to have at least a decade of the hybrid approach, where RT is used for certain effects only, at a (hopefully) steadily decreasing cost. And this isn't even talking about full-fat path tracing, which is significantly more expensive.
I don't think it'll materialize in its current form or implementation.

At best it will be on the sidelines as yet another graphical effect you can apply. Hybrid, indeed, but I think the economic reality for developers is that there's too much effort in it. Raster beats it in every way: proven technology, universal hardware support, biggest target market, easy to scale across a wide range of performance levels and configs, etc. The upside of RT just doesn't weigh in enough to offset all of that: a minor graphical upgrade that can in places be matched with raster effects, and is not essential elsewhere for a decent gaming experience. It makes much more sense to add it as a bonus/photo-mode-category option, to keep incentivizing people to get onto a platform where they can use it, upselling GPUs and teasing games indefinitely, and to use it only sparingly in games/sequences where it really adds to the experience.

In short, RT works best as a marketing tool. Even NVIDIA is NOT on a trajectory with its GPUs to really get RT going big-time: they skimp on VRAM, which RT needs a bit more of, and they offered the biggest perf uplift this gen only on a top-end GPU that clearly isn't positioned to 'saturate' the market. They do that instead with a 4070 that has 12 GB and can barely run today's games with all the bells and whistles at 1440p, while below that they have a hopelessly cut-down 4060 with 8 GB that can't even dream of going there at 1080p.
Posted on Reply
#39
R0H1T
Well, it's a selling point/upsell for sure; even the $3 trillion fruity loops has jumped on the bandwagon, and they're probably the epitome of not adopting such features o_O
Posted on Reply
#40
Vayra86
gffermariIf you don't cut corners, DLSS+FG+...+...+..+.., you'll get 4090 native performance in 25 years.
But you don't really need it for a game to look good. Games have looked great for a decade now, and have run on much lesser hardware.

We're being taken for fools. All these technologies are built to hide stagnation in hardware specs. It started with the 'doubling of shader cores' on Nvidia's side. Of course, it enabled progress, but we didn't really get 10K shaders all of a sudden; they were just counted differently, with stuff taken out and placed elsewhere. Similarly, VRAM has regressed relative to the performance uplift of the GPU core. That's that semiconductor magic for ya.

Every GPU gen is simply another iterative step of improvements. There is little groundbreaking going on; it progresses smoothly from one gen into the next, taking multiple advances in technology in its stride. There's absolutely nothing special about a 4090's performance. Now, as ten years ago, if you want games to look good, you need developer TLC, not a great graphics card. In that sense, stagnation in GPU hardware is not a bad thing considering where we are right now. It just means you don't have to drop another grand on a GPU two years from now.
Prima.VeraI'm just going to say this one time.
There is NO game in the world that justifies paying more than $1,000 for a video card. There is NONE.
I challenge anyone to say otherwise.
Sounds like a moving target, though I agree. But inflation, bruh.

Also, I never considered that limit to be 1K ten years ago... More like $500, and I held that target up to and including Pascal.
Posted on Reply
#41
Assimilator
MLID leaking poo from their incontinent anus again; in past years it would have been a nurse cleaning grandpa up yet again, but in the 2020s it's a "source" trusted by idiots.
Posted on Reply
#42
efikkan
ReadlightSeparate AI from games.
I can't keep up with inflation. Prices have doubled in 10 years.
At least, and as I've been saying for a while, this "AI" hype will eventually lead to someone designing cheap ASICs for specific purposes, meaning the end for these "AI"-tuned GPUs in the long run.
Nvidia would be foolish to focus their development effort on this market, especially since they will soon have numerous competitors there.
BTW, wasn't there recently some news about AMD focusing on this too?
DavenExcept that Arrow Lake will have slower CPU performance than Raptor Lake.
So, you know that from having access to a QS of Arrow Lake? ;)
(That is a bit early, if you ask me.)
But when real leaks happen (not from nobodies pulling stuff out of thin air), there is usually something substantive to go along with it. Getting a glimpse of the final engineering samples is always exciting, and we might get that a couple of months ahead of release, if we're lucky.
Posted on Reply
#43
Onyx Turbine
efikkanAt least, and as I've been saying for a while, this "AI" hype will eventually lead to someone designing cheap ASICs for specific purposes, meaning the end for these "AI"-tuned GPUs in the long run.
Nvidia would be foolish to focus their development effort on this market, especially since they will soon have numerous competitors there.
BTW, wasn't there recently some news about AMD focusing on this too?


So, you know that from having access to a QS of Arrow Lake? ;)
(That is a bit early, if you ask me.)
But when real leaks happen (not from nobodies pulling stuff out of thin air), there is usually something substantive to go along with it. Getting a glimpse of the final engineering samples is always exciting, and we might get that a couple of months ahead of release, if we're lucky.
When one has the patience to wait for a new build, do you anticipate the entry-level Arrow Lake CPUs being worth the wait compared to a 14100/14400?
Or will this be a negligible uplift in performance?
Posted on Reply
#44
efikkan
ChippendaleWhen one has the patience to wait for a new build, do you anticipate the entry-level Arrow Lake CPUs being worth the wait compared to a 14100/14400?
Or will this be a negligible uplift in performance?
I don't know the performance or pricing of Arrow Lake, nor do I have any kind of "sources". No one will truly know before final samples arrive this summer(?) or fall. My understanding is that it will be a medium-to-large architectural upgrade, but that doesn't give us any specific figures.

But speaking from following the PC market for over 20 years, if you're building an "entry level" desktop, and your constraint is primarily cost (not energy efficiency or special features of a new platform), then you should buy whenever you feel the "need", not wait for a better deal.

Whenever a new platform launches, it usually takes a while before cheaper (but decent) motherboards arrive, so in terms of value you're probably going to find a better deal before that. I generally advise value buyers to buy the best deal available when they feel the "need" for a PC. You never know when good deals will arrive, and these days they seem to be few and far between.

In the past I've often hunted parts for months for a build, but I don't think there's much to save from that anymore; it quickly takes forever waiting for a part to get cheaper, and it might actually never get cheaper once you're locked into certain parts. For people who can cover the cost with their monthly income, I'd rather advise buying it all together when needed; otherwise, put the money into managed funds in these inflationary times. (You will not get any "yields", literal or otherwise, from capital bound up in a partly built PC. ;))
Posted on Reply
#45
xSneak
Prima.VeraI'm just going to say this one time.
There is NO game in the world that justifies paying more than $1,000 for a video card. There is NONE.
I challenge anyone to say otherwise.
Cyberpunk with path tracing. Easily.
Posted on Reply
#46
Ownedtbh
Just put gaming capabilities into AI cards; some of us would love to have 6+ games open at the same time.
Posted on Reply
#47
Gica
matarNo competition = higher prices over already high prices
They will keep the same boost for the king 5090 and queen 5080; there, the performance jumps have always been big. For the rest, let's look at the glass half full: you will have fewer reasons to change your video card if the performance increase is pathetic.
Posted on Reply
#48
Beginner Macro Device
Dirt ChipAll it takes is 4080 performance level with the 4090's memory capacity, 4070 wattage and a 4070 Ti price.
You just invented the RX 7900 XTX. Well, with 1.7x the power consumption.
xSneakThe advancement we have seen from the 1080 Ti to the 4090 has been legendary, much like the jump from 2d to 3d.
If we throw the ~$2,000 price tag out the window, then yes, it's drastic. Yet the 4070 Ti is the closest $-to-$ successor, just a tad cheaper than the 1080 Ti was (inflation considered). Not that impressive, huh?
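A rough sketch of that $-to-$ math (launch MSRPs are the real figures; the cumulative US inflation factor is an approximation, not an official number):

```python
# Sketch of the "$-to-$ successor" comparison above. MSRPs are the actual
# launch prices; the cumulative US CPI factor for 2017->2023 (~1.23) is an
# approximation, not an official figure.
GTX_1080_TI_MSRP_2017 = 699
RTX_4070_TI_MSRP_2023 = 799
CPI_FACTOR_2017_TO_2023 = 1.23  # assumed cumulative inflation

adjusted = GTX_1080_TI_MSRP_2017 * CPI_FACTOR_2017_TO_2023
print(f"1080 Ti in 2023 dollars: ~${adjusted:.0f}")          # ~$860
print(f"4070 Ti vs that: ${RTX_4070_TI_MSRP_2023} (a tad cheaper)")
```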
Posted on Reply
#49
HOkay
LycanwolfenI have not been impressed with any of the newer cards. AI and upscaling seem to be the selling points now. I'm still at 2160p, which NVIDIA said with the 1000-series cards was the way to go. I went 4K with twin 1080 Tis in SLI and it still runs great. All the newer cards need software to achieve the same result my pure hardware can. GTX cards were the last of the pure hardware; RTX is ray-tracing software gimmicks. Yes, DLSS is impressive, but it cuts corners. Take off DLSS and all the gimmicks and run 4K, and you see the true power of the 4090. It is not impressive at all.

Also, today's gamers are like, "oooh, the game does not have an SLI option, so I guess it's not supported." Hogwash. Most games in the past never had an option but still supported it. Make the changes in the driver, not the game, duh.
I enjoyed my 1080 Ti SLI days, though it was by no means plain sailing: most games you could get working pretty well, but frame pacing was usually not perfect. I remember moving to a single 2080 Ti & being disappointed, since in the best-case scenarios for the 1080 Tis they'd smoke the 2080 Ti. But moving to the 3080 was great, & even the best-case scenarios for 2x 1080 Ti wouldn't beat the 3080. They could still get close though! But then came the 4090. Sure, it's kind of cheating by using crazy wattages, but being an ex-1080 Ti SLI guy that wasn't a problem in my eyes at all, & you get very close to double the performance of a 3080! Which then means, of course, that 2x 1080 Ti best-case scenarios are still only half the frame rate of a 4090 too. So in three generations we've ended up with ~4x the performance, albeit at a very large increase in power and price. I doubt we'll see a 4x performance increase in the next three gens, sadly :(
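For reference, a minimal sketch of the compounding math behind that ~4x figure (the 1.3x/gen number in the second part is hypothetical, in line with the article's "smaller uplift" speculation):

```python
# If three generations compound to ~4x total performance (2x 1080 Ti SLI
# -> 4090, per the post above), the implied average per-generation uplift:
total_uplift = 4.0
generations = 3
per_gen = total_uplift ** (1 / generations)
print(f"~{per_gen:.2f}x per generation")  # ~1.59x

# Conversely, a more modest (hypothetical) 1.3x/gen would compound to only:
print(f"1.3x/gen over 3 gens: ~{1.3 ** 3:.2f}x total")  # ~2.20x
```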
Posted on Reply
#50
lexluthermiester
What I would like to see from the RTX 5000 series: more refined efficiency, and thus shorter/smaller/thinner cards and lower TDPs (the stacked dies would help there, if the rumors are correct); a move back to standard PCIe power connectors; a jump to a VRAM standard of 12 GB, 16 GB and 24 GB/32 GB (leaving 8 GB for the lower tiers); more user customization and control in the VBIOS; and finally, better support for the small-form-factor sector of PCs.
Posted on Reply