Sunday, December 24th 2023

NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

NVIDIA's next-generation GeForce RTX 50-series "Blackwell" gaming GPUs are on course to debut toward the end of 2024, with a Moore's Law is Dead report pinning the launch to Q4 2024. This timeline is easy to predict, as each GeForce RTX generation tends to get about two years of market presence: the RTX 40-series "Ada" debuted in Q4 2022 (October 2022), and the RTX 30-series "Ampere" in late Q3 2020 (September 2020).

NVIDIA's roadmap for 2024 sees a Q1 debut of the RTX 40-series SUPER, with three high-end SKUs refreshing the upper half of the RTX 40-series. The MLID report goes on to speculate that the generational performance uplift of "Blackwell" over "Ada" will be smaller than that of "Ada" over "Ampere." With AI HPC GPUs outselling gaming GPUs 5:1 in terms of revenue, and AMD rumored to be retreating from the enthusiast segment with its next-gen RDNA4, it is easy to see why.
Source: Moore's Law is Dead (YouTube)

126 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

#51
stimpy88
I really hope the AI bubble bursts before these come out, or we will see "gamer" card prices like we have never seen before! The AI industry hates nGreedia, but until a viable alternative platform arrives, we are at nGreedia's mercy, and mercy is not something they are famous for!
Posted on Reply
#52
RamiHaidafy
It's possible that the rumor that AMD isn't making a high-end RDNA 4 card is only half true. It's likely that they aren't making a high-end chip (Navi 41), but that doesn't mean they won't combine mid-range Navi 42/43 chips to make a high-end card.

This approach is cheaper for AMD, and can result in high-end performance.

AMD started doing chiplet GPUs with RDNA 3, and I think that's only going to continue.
Posted on Reply
#53
efikkan
FahadIt's possible that the rumor that AMD isn't making a high-end RDNA 4 card is only half true. It's likely that they aren't making a high-end chip (Navi 41), but that doesn't mean they won't combine mid-range Navi 42/43 chips to make a high-end card.

This approach is cheaper for AMD, and can result in high-end performance.

AMD started doing chiplet GPUs with RDNA 3, and I think that's only going to continue.
Wasn't the main point of chiplets to be able to easily scale the design and fill the entire market?
And can you elaborate on what you mean by combining mid-range chips?

Does anyone know where that rumor comes from? Because if it doesn't come from a trusted source, we should ignore it.
Posted on Reply
#54
RamiHaidafy
efikkanWasn't the main point of chiplets to be able to easily scale the design and fill the entire market?
And can you elaborate on what you mean by combining mid-range chips?

Does anyone know where that rumor comes from? Because if it doesn't come from a trusted source, we should ignore it.
Right now, the high-end RDNA 3 cards consist of one Graphics Compute Die (GCD) and multiple Memory Cache Dies (MCDs) on the same package.

It's possible that with RDNA 4, AMD might combine more than one GCD on the package. My guess is that they would be smaller, mid-range GCDs rather than one big, expensive, high-end GCD. This should be cheaper to make and less risky in terms of yields and defects, while still letting AMD reach enthusiast-class performance.

So just because the rumor says AMD won't make a high-end RDNA 4 chip/GCD doesn't mean they won't make a high-end RDNA 4 card.
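
A rough way to see the yield argument above: under a simple Poisson defect model, the share of defect-free dies falls exponentially with die area, so two small GCDs recover more sellable silicon per wafer than one big one. The die sizes and defect density below are made-up illustrative numbers, not actual AMD or TSMC figures; a minimal sketch in Python:

```python
import math

WAFER_AREA = 70_685.0   # mm^2, 300 mm wafer, edge losses ignored
DEFECT_DENSITY = 0.1    # defects per cm^2 -- an assumed, illustrative value

def poisson_yield(die_area_mm2: float) -> float:
    """Fraction of defect-free dies: Y = exp(-area * defect_density)."""
    return math.exp(-(die_area_mm2 / 100.0) * DEFECT_DENSITY)

def good_dies_per_wafer(die_area_mm2: float) -> float:
    return (WAFER_AREA / die_area_mm2) * poisson_yield(die_area_mm2)

big, small = 500.0, 250.0  # hypothetical high-end vs. mid-range GCD sizes
print(f"One {big:.0f} mm^2 GCD per card:   {good_dies_per_wafer(big):.0f} cards/wafer")
print(f"Two {small:.0f} mm^2 GCDs per card: {good_dies_per_wafer(small) / 2:.0f} cards/wafer")
# With these numbers: ~86 vs ~110 sellable packages per wafer, because
# each defect only scraps 250 mm^2 of silicon instead of 500 mm^2.
```

(Packaging two dies adds its own cost, which is why the die-to-die interconnect has to be cheap for this to pay off.)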
Posted on Reply
#55
FiRe
Space LynxNot going to lie, if I was rich a 2nm Arrow Lake and RTX 5090 would be epic as fuck combo. lol

Considering the price I paid for my current rig though, a $200 CPU, a $110 mobo, and a $705 GPU on sale... meh. I am happy where I am.

If someone hires me with a decent salary next year though, I might consider selling my current rig and getting my dream Ultima 5090 and 2nm Arrow Lake combo in Winter 2024.
lol
Posted on Reply
#56
RayneYoruka
matarNo competition = higher prices over already high prices
Facts
Posted on Reply
#57
mechtech
meh

RX 6600 for $250 CAD (~$180 USD), and turn down graphics settings and resolution.
Posted on Reply
#58
dyonoctis
efikkanAt least, and as I've been saying for a while, this "AI" hype will eventually lead to someone designing cheap ASICs for specific purposes, meaning the end of these "AI"-tuned GPUs in the long run.
Nvidia would be foolish to focus their development effort on this market, especially since they will soon have numerous competitors there.
BTW, wasn't there recently some news about AMD focusing on this too?
But AI is already running on specialized hardware, even on Nvidia GPUs. Jensen doesn't see what he's doing right now as "general computing"; on the contrary, the guy said that specialized hardware is the future. He calls what they are doing right now accelerated/heterogeneous computing. blogs.nvidia.com/blog/what-is-accelerated-computing/

Nvidia's Tensor cores and AMD/Intel NPUs are part of the general package; they don't want to make a dedicated AI expansion card. I'm seeing a lot of comparisons being made between AI and mining, but that's not what's happening; mainstream apps are already accelerated by AI-specialised hardware:


blog.adobe.com/en/publish/2023/04/18/denoise-demystified


See how Intel/AMD are not mentioned? They were late to join the game, so they will be late to get equivalent software support. Same old story. Same reason why Nvidia still dominates the market even though they are the least-liked company, going by what you see on forums.


Also keep in mind that the AI-specialized GPUs (A100/H100) are configured differently from the desktop parts. The A100 is based on Ampere, but if you compare the GA100 to the GA102 used in the 3090, you'll see that it's not just a bigger die:
(block diagrams: GA100 vs. GA102)
Posted on Reply
#59
Dimitriman
Because I love a toxic relationship, I can't wait for the 50 series and a lot of gaslighting and emotional abuse from Nvidia /s
Posted on Reply
#60
Gooigi's Ex
AND IT'S ALL AMD's FAULT *cries in Spanish*
Posted on Reply
#61
PapaTaipei
EdInkNow I can afford the previous generation
Not the case, as prices didn't drop AT ALL, despite what the hardware news websites and YouTubers are saying. A two-year-old GPU is still selling above MSRP as of now.
Posted on Reply
#62
QUANTUMPHYSICS
I hope y'all have been buying Nvidia and AMD stock (as well as Micron and Intel)

I've earned so much since 2020, I could buy several of these overpriced monstrosity cards.
Posted on Reply
#63
Divide Overflow
Without competition from AMD, bend over, grab your ankles and bite the pillow.
Posted on Reply
#64
xSneak
Beginner Micro DeviceYou just invented RX 7900 XTX. Well, with 1.7x that power consumption.


If we throw the ~$2,000 price tag out of the window, then yes, it's drastic. Yet the 4070 Ti is the closest $-to-$ successor, just a tad cheaper than the 1080 Ti was (inflation considered). Not that impressive, huh?
A 4070 Ti completely blows a 1080 Ti out of the water in terms of performance and features. The quality of video encoding on the newer NVIDIA GPUs alone is worth the upgrade, let alone all the RTX/DLSS features... :laugh:
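
For what it's worth, the inflation comparison in the quoted post roughly checks out. The CPI factor below is an assumed approximation (US CPI, 2017 to 2023), not an official figure:

```python
# Quick sanity check of the "tad cheaper, inflation considered" claim.
GTX_1080_TI_MSRP = 699    # USD, March 2017 launch price
RTX_4070_TI_MSRP = 799    # USD, January 2023 launch price
CPI_2017_TO_2023 = 1.24   # assumed cumulative US inflation factor

adjusted = GTX_1080_TI_MSRP * CPI_2017_TO_2023
print(f"1080 Ti MSRP in 2023 dollars: ~${adjusted:.0f}")  # ~$867
print(f"4070 Ti MSRP:                  ${RTX_4070_TI_MSRP}")
```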
Posted on Reply
#65
Beginner Macro Device
xSneakA 4070 Ti completely blows a 1080 Ti out of the water in terms of performance and features.
So does the 1080 Ti compared to the GTX 580 (the same six-year gap, and an even bigger difference in speed and efficiency). It's normal for newer GPUs to be much faster and more efficient than older ones. Why you are so impressed, I don't know.
Posted on Reply
#66
efikkan
DimitrimanBecause I love a toxic relationship, I can't wait for the 50 series and a lot of gaslighting and emotional abuse from Nvidia /s
I hope you think of it as entertainment, because if you do, each product cycle can be quite exciting. (Especially if you go back and review the "leakers" in hindsight; then it's obvious who is full of manure.)

But we should expect a lot of BS whenever a new product arrives: cherry-picked benchmarks, new pointless gimmicks (usually some new nonsense AA technique), etc.
Then there is the overreaction from all the fanboys on the other team.
And then, 2-3 years later, people finally acknowledge it was actually a good product (like with the 1080 Ti, the 2000 series, etc.).
QUANTUMPHYSICSI hope y'all have been buying Nvidia and AMD stock (as well as Micron and Intel)

I've earned so much since 2020, I could buy several of these overpriced monstrosity cards.
Remember, you haven't really earned anything until the investment is realized. ;)
If you're smart, you gradually sell these overpriced stocks and move the investment into managed funds with a solid track record, so you can continue growing your wealth for years to come. Don't wait to see where the top of a single stock is, because by then it's too late. :)
Posted on Reply
#67
Dawora
FourstaffBased on current demand and foundry prices I will not be surprised if 5 series comes in with another massive price hike.
It's always about the price/perf ratio,
NEVER price alone!

I don't care about naming.
If they release an RTX 5030 and it's $700 but faster than an RTX 4090, then that's better price/perf.

So it's not a price hike alone, it's also a performance hike.
Pls, don't be stupid.
Posted on Reply
#68
Awwwyeahhhbaby
NordicAm I just getting older or do these GPU releases seem to come faster?
It's a phenomenon that's easy to visualize on a graph: the older you get, the smaller the portion of your life each year represents, so time seems to "go faster" than before.
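
A quick way to see that idea in numbers, assuming the common folk model (not established science) that a year's perceived length is proportional to the fraction of your life it spans:

```python
# If year N of your life "weighs" 1/N, each successive year shrinks.
for age in (10, 20, 30, 40, 60):
    print(f"Age {age}: one year is {1 / age:.1%} of your life so far")
```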
Posted on Reply
#69
THU31
They should only release mid-range GPUs for gamers and keep the high-end exclusively for AI and other professional uses.

Just do a 5060 and a 5070 for $300 and $500. Two SKUs would not tie up the production lines, so they could still make tons of AI GPUs that sell for thousands of dollars. If they do a 5090 with 32 GB and 450+ W, it's just stupid. Who needs that?
Posted on Reply
#70
Melvis
Who cares? No one could afford the 4000 series, let alone these 5000 series GPUs... so sure, bring it on, and let us all watch and sigh and dream on.
Posted on Reply
#71
ThrashZone
Hi,
Not sure it's about not affording them.
I think it's more of a priority clash; besides, why give NVIDIA so much money when there are so many better hobbies that outlive GPUs, lol.
Same goes for Intel's toomanylakes :laugh:
Posted on Reply
#72
stimpy88
ThrashZoneHi,
Not sure it's about not affording them.
I think it's more of a priority clash; besides, why give NVIDIA so much money when there are so many better hobbies that outlive GPUs, lol.
Same goes for Intel's toomanylakes :laugh:
Well, I couldn't justify spending £1200 to £2200 on the only two 40x0 series cards that make any kind of sense. Even the 4080 is a bad buy, because it's so slow in modern games @ 4K. The gap between it and the 4090 is just too wide; if the 4080 had been 20% faster, I would have pulled the trigger. The 40x0 series is the worst-value series nGreedia have ever released, and I don't think their shenanigans will stop with the 50x0 series.

I know a lot of people who felt the same and are holding out for the 50x0 series instead, hoping that even the lower-range cards will offer some meaningful improvement over the 20x0 and 30x0 series cards everyone still uses.
Posted on Reply
#73
Prima.Vera
stimpy88I really hope the AI bubble bursts before these come out, or we will see "gamer" card prices like we have never seen before! The AI industry hates nGreedia, but until a viable alternative platform arrives, we are at nGreedia's mercy, and mercy is not something they are famous for!
AI is here to stay. I foresee that toilet seats will soon have AI here in Japan...
Posted on Reply
#74
stimpy88
Prima.VeraAI is here to stay. I foresee that toilet seats will soon have AI here in Japan...
Hahaha, oh I'm sure that "A.I." is going to be plastered all over all sorts of products that have nothing to do with it!

I was thinking more along the lines of nGreedia's stranglehold on it; most of the major players have been announcing their own chips instead of using NV chips.
Posted on Reply
#75
gffermari
stimpy88Well, I couldn't justify spending £1200 to £2200 on the only two 40x0 series cards that make any kind of sense. Even the 4080 is a bad buy, because it's so slow in modern games @ 4K. The gap between it and the 4090 is just too wide; if the 4080 had been 20% faster, I would have pulled the trigger. The 40x0 series is the worst-value series nGreedia have ever released, and I don't think their shenanigans will stop with the 50x0 series.

I know a lot of people who felt the same and are holding out for the 50x0 series instead, hoping that even the lower-range cards will offer some meaningful improvement over the 20x0 and 30x0 series cards everyone still uses.
....but the 4080 is a 1440p card. Not a 4K one.

The 2000 series was the worst-value lineup ever released. There was no use case that got the most out of them. They had way more value later (DLSS) than when they were released.
Posted on Reply