Sunday, December 24th 2023
NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024
NVIDIA's next-generation GeForce RTX 50-series "Blackwell" gaming GPUs are on course to debut toward the end of 2024, with a Moore's Law is Dead report pinning the launch to Q4 2024. This is an easy timeline to predict, as every GeForce RTX generation tends to get about two years of market presence: the RTX 40-series "Ada" debuted in Q4 2022 (October 2022), and the RTX 30-series "Ampere" in late Q3 2020 (September 2020).
NVIDIA's roadmap for 2024 sees a Q1 debut of the RTX 40-series SUPER, with three high-end SKUs refreshing the upper half of the RTX 40-series lineup. The MLID report goes on to speculate that the generational performance uplift of "Blackwell" over "Ada" will be smaller than that of "Ada" over "Ampere." With AI HPC GPUs outselling gaming GPUs by 5:1 in revenue terms, and AMD rumored to be retreating from the enthusiast segment with its next-gen RDNA4, it is easy to see why.
Source:
Moore's Law is Dead (YouTube)
126 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024
There is NO game in the world that justifies a user paying more than $1,000 for a video card. There is NONE.
I challenge anyone to say otherwise.
What about gaming, encoding/decoding, processing with CUDA etc.?
If a 5060 performs close to a 4070, and a 5060 Ti perhaps even slightly better, at even lower power consumption, I can see these cards aging well in mid-range PC builds. The cherry on the cake would be the ability to pair one, by Q1 2025 at the latest, with a new bread-and-butter Intel CPU.

Buying a 4060 doesn't convince me that much: as prices drop and the gap between the 4060 and 4060 Ti narrows, I calculated that the 4060 Ti becomes the more attractive choice, but probably not when weighed against waiting for the 5060/5060 Ti. IMHO Nvidia succeeded with the 4060 in making a power-efficient card with enough performance for 1080p, but a bit too little for what's coming in the future or for the option of 1440p.

If money isn't really an issue for a sensible mid-range build, I think a great-value card at the moment would be the 4070 Super.

With ever-increasing power prices on their way toward €0.50(!) per kWh, what I wish for is a 5060 series with even lower idle and YouTube-playback power consumption, and, don't laugh, a TDP of 100 W with every last drop of performance squeezed out of it.
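To put that wish in numbers, here is a back-of-the-envelope sketch of yearly running cost at €0.50 per kWh; the daily gaming hours and the wattage comparison points are illustrative assumptions, not measurements:

```python
# Rough yearly energy cost for a GPU under gaming load.
# Hours per day and the wattage figures are illustrative assumptions.

PRICE_PER_KWH_EUR = 0.50   # the ~0.50 EUR/kWh scenario from the comment
HOURS_PER_DAY = 3          # assumed daily gaming time
DAYS_PER_YEAR = 365

def yearly_cost_eur(gpu_watts: float) -> float:
    """Energy cost per year for a card drawing gpu_watts while gaming."""
    kwh_per_year = gpu_watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * PRICE_PER_KWH_EUR

for watts in (100, 160, 220):  # hypothetical 100 W wish vs. beefier cards
    print(f"{watts:>3} W -> {yearly_cost_eur(watts):6.2f} EUR/year")
```

At three hours a day, the 100 W card works out to about €55 a year versus about €120 for a 220 W card, which is why squeezing every drop out of 100 W matters at those electricity prices.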
Also, today's gamers are like, "oooh, the game does not have an SLI option, so I guess it's not supported." Hogwash. Most games in the past never had an in-game option but still supported it. Make the changes in the driver, not the game, duh.
At best it will be on the sidelines as yet another graphical effect you can apply. Hybrid, indeed, but I think the economic reality for developers is that there's too much effort in it. Raster beats it in every way: proven technology, universal hardware support, biggest target market, easy to scale across a wide range of performance levels and configs, etc. The upside of RT just doesn't weigh in enough to offset all of that: a minor graphical upgrade that can in places be matched with raster effects, and isn't essential elsewhere for a decent gaming experience. It makes much more sense to add it as a bonus/photo-mode-category option to keep incentivizing people to get onto a platform where they can use it, upsell GPUs and game teasers indefinitely, and only use it sparingly in games/sequences where it really adds to the experience.
In short, RT works best as a marketing tool. Even Nvidia is NOT on a trajectory with its GPUs to really get RT going big time: they skimp on VRAM, which RT needs a bit more of, and they offered the biggest perf uplift this gen only on a top-end GPU that clearly isn't positioned to 'saturate' the market. They do that instead with a 4070 that has 12 GB and can barely run today's games with all the bells and whistles at 1440p, while below that they have a totally gimped 4060 with 8 GB that can't even dream of going there at 1080p.
We're being taken for fools. All these technologies are built to hide stagnation in hardware specs. It started with Nvidia's 'doubling of shader cores.' Of course, it enabled progress, but we didn't really get 10k shaders all of a sudden; they were just counted differently, with stuff taken out and placed elsewhere. Similarly, VRAM has regressed relative to the performance uplift in the GPU core. That's that semiconductor magic for ya.
Every GPU gen is simply yet another iterative step of improvements. There is little groundbreaking going on; it progresses smoothly from one gen into the next, taking multiple advances in technology in its stride. There's absolutely nothing special about a 4090's performance. Now, as ten years ago, if you want games to look good, you need developer TLC, not a great graphics card. In that sense, stagnation in GPU hardware is not a bad thing considering where we are right now; it just means you don't have to drop another grand on a GPU two years from now. Sounds like a moving target, and while I agree... inflation, bruh.
Also, I never considered that limit to be 1K ten years ago... More like $500, and I held that target up to and including Pascal.
Nvidia would be foolish to focus their development effort on this market, especially since they will soon have numerous competitors there.
BTW, wasn't there recently some news about AMD focusing on this too? So, you know that from having access to a QS (qualification sample) of Arrow Lake? ;)
(A bit early, if you ask me.)
But when real leaks happen (not from nobodies pulling stuff out of thin air), there is usually something substantive to go along with it. Getting a glimpse of the final engineering samples is always exciting, and we might get that a couple of months ahead of release, if we're lucky.
Or will this be a negligible uplift in performance?
But speaking as someone who has followed the PC market for over 20 years: if you're building an "entry level" desktop and your constraint is primarily cost (not energy efficiency or special features of a new platform), then you should buy whenever you feel the "need", not wait for a better deal.
Whenever a new platform launches, it usually takes a while before cheaper (but decent) motherboards arrive, so in terms of value you're probably going to find a better deal before then. I generally advise value buyers to grab the best deal available when they feel the "need" for a PC. You never know when good deals will arrive, and these days they seem few and far between.
In the past I've often hunted parts for months for a build, but I don't think there's much to save that way any more; it quickly takes forever waiting for a part to get cheaper, and a part might never actually get cheaper once you're locked into the rest. For people who can cover the cost with their monthly income, I rather advise buying everything together when needed, and otherwise putting the money into managed funds in these inflationary times. (You will not get any "yields", literal or otherwise, from capital bound up in a partly built PC. ;) )
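To put a rough number on that opportunity cost, here is a minimal sketch; the budget, fund return, and part-hunting duration below are made-up assumptions for illustration, not figures from the comment:

```python
# Opportunity cost of money parked in half-finished PC parts
# instead of a managed fund. All figures are illustrative assumptions.

BUDGET_EUR = 1500        # assumed total build budget
ANNUAL_RETURN = 0.05     # assumed fund return per year
MONTHS_PARKED = 6        # assumed time spent hunting for parts

# Compound the assumed annual return over the parked period.
growth = (1 + ANNUAL_RETURN) ** (MONTHS_PARKED / 12)
foregone_eur = BUDGET_EUR * (growth - 1)
print(f"Foregone return over {MONTHS_PARKED} months: ~{foregone_eur:.0f} EUR")
```

Under these assumptions the output is roughly 37 EUR: not dramatic, but part-hunting only pays off if the discounts you find actually beat that.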