Sunday, December 24th 2023

NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

NVIDIA's next-generation GeForce RTX 50-series "Blackwell" gaming GPUs are on course to debut toward the end of 2024, with a Moore's Law is Dead report pinning the launch to Q4-2024. This is an easy timeline to predict, as every GeForce RTX generation tends to get about two years of market presence: the RTX 40-series "Ada" debuted in Q4-2022 (October 2022), and the RTX 30-series "Ampere" in late Q3-2020 (September 2020).

NVIDIA's roadmap for 2024 sees a Q1 debut of the RTX 40-series SUPER, with three high-end SKUs refreshing the upper half of the RTX 40-series lineup. The MLID report goes on to speculate that the generational performance uplift of "Blackwell" over "Ada" will be smaller than that of "Ada" over "Ampere." With AI HPC GPUs outselling gaming GPUs by 5:1 in terms of revenue, and AMD rumored to be retreating from the enthusiast segment for its next-generation RDNA4, it is easy to see why.
Source: Moore's Law is Dead (YouTube)

126 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

#101
umeng2002
Nordic: Would you say that the RTX 3090 fails to deliver playable frame rates with frame generation or not?
I don't know, since it doesn't have DLSS 3 and I haven't seen any tests with FSR 3 modded into Cyberpunk with path tracing on a 3090. At 1440p with FSR 3 modded in and using FSR 2, with path tracing on, the 7900 XTX can turn into a slide show.
Posted on Reply
#102
Prima.Vera
Waldorf: @Prima.Vera
How about people running at 1K or 2K who are switching to 4K? Not everyone centers their purchase on a (single) game...

That's ignoring that if I still had my old job, the next xx80 with a water block would have been mine, without having to do a single thing with games...
Except that only the 4090 can truly be called a 4K-capable card, and even that becomes pretty useless if you activate ray tracing. So it's still not worth a penny over $800, and that's me being generous.
Posted on Reply
#103
Nordic
umeng2002: Just to reiterate, with path tracing, there is no competition. The 7900 XTX absolutely fails to deliver playable frame rates even with frame generation.
umeng2002: I don't know, since it doesn't have DLSS 3 and I haven't seen any tests with FSR 3 modded into Cyberpunk with path tracing on a 3090. At 1440p with FSR 3 modded in and using FSR 2, with path tracing on, the 7900 XTX can turn into a slide show.
I was asking because people used to say the same thing about the RTX 3090. AMD is only one generation behind Nvidia. The 7900 XTX provides ray tracing performance better than a 4070, and between a 3090 and a 3090 Ti on average in most ray tracing games.
(I'm borrowing these charts from the latest GPU review. Please ignore the ASRock 7900 XT.)

Cyberpunk is one of the more demanding ray tracing games, and it favors Nvidia. I do agree the 7900 XTX doesn't have playable FPS in Cyberpunk, but then only the 4090 does. AMD is just one generation behind and could easily reach 4090-level ray tracing performance next generation.

Posted on Reply
#104
Waldorf
@Prima.Vera
I don't buy cards based on what anyone says they are for or should do. I buy the "best" I can get for the funds I have, usually avoiding the top model, and I'd rather upgrade every generation as long as someone buys my existing card. At least that's what I've done for the past 22 years.

That's ignoring the fact that I don't need a 3090 or even a 4090 to play MY games; they all run fine at 2160p/60 with G-Sync. And even if I picked up a new AA title, nothing prevents me from running it at 1440p; my screen still stays at 2160p...

And value is in the eye of the beholder: would you buy your "dream" car/bike/house (whatever it is you want that's "out of reach") if it cost two to three times your yearly income? Probably not. Would you buy it if you had 10 million in your account? Almost certainly, even though the price didn't change.

(It's one thing to say "not for me", but that doesn't mean I'll "sour" it for others just because I can't or won't spend the money. E.g., my friend hasn't upgraded since the 2080 Ti, has a nice income from working his ass off when he isn't taking care of 3 (5) kids, and doesn't spend money on anything else, so he shouldn't get an xx90 because it's bad value?)

To be clear: nothing I've ever said on TPU means I'm OK with GPU/NV prices or pricing.
Posted on Reply
#105
Bluewater82
Prima.Vera: I'm just going to say this one time.
There is NO game in the world that deserves a user paying more than $1,000 for a video card. There is NONE.
I challenge anyone to say otherwise.
I see you do not understand supply and demand. The GPU market is dominated by Nvidia and they can charge however much they want and people continue to buy up their product. To you and a bunch of other people the price isn’t worth it, to hundreds of thousands of others it is. No music or movie is worth millions of dollars to me, but there is a successful market in which high-end home entertainment systems run hundreds of thousands and even millions of dollars. People able to afford 4k monitors are able to afford the hardware required to run them. People who spend thousands of dollars on sim rigs for racing and DCS have no problem spending another thousand on the latest and greatest card that will allow them to run larger crystal clear displays or power VR headsets that require extra oomph to generate high enough frame rates to reduce stutters that destroy immersion and induce nausea.

As with any product out in the world there will be low-, mid-, and high-end offerings. People who are satisfied with low performance are completely free to ignore the high-end stuff but that doesn’t mean those of us with higher incomes and higher standards shouldn’t have the option to buy Lamborghinis while the rest of the peasants putter around in Honda Fits.
Posted on Reply
#106
AusWolf
Bluewater82: I see you do not understand supply and demand. The GPU market is dominated by Nvidia and they can charge however much they want and people continue to buy up their product. To you and a bunch of other people the price isn’t worth it, to hundreds of thousands of others it is. No music or movie is worth millions of dollars to me, but there is a successful market in which high-end home entertainment systems run hundreds of thousands and even millions of dollars. People able to afford 4k monitors are able to afford the hardware required to run them. People who spend thousands of dollars on sim rigs for racing and DCS have no problem spending another thousand on the latest and greatest card that will allow them to run larger crystal clear displays or power VR headsets that require extra oomph to generate high enough frame rates to reduce stutters that destroy immersion and induce nausea.

As with any product out in the world there will be low-, mid-, and high-end offerings. People who are satisfied with low performance are completely free to ignore the high-end stuff but that doesn’t mean those of us with higher incomes and higher standards shouldn’t have the option to buy Lamborghinis while the rest of the peasants putter around in Honda Fits.
The only thing I'd add is: if a Honda Fit is good enough for you, then by all means, buy a Honda Fit. It gets you from A to B just like any other car. If you want to do track days, on the other hand, and you have the means, then buy a Lamborghini. There's no right or wrong, and neither option makes you a peasant or a king. The things you own do not define you as a person, no matter how hard every single company and its mindless fans try to tell you otherwise.

My limit for a GPU is around £500, and I'm mostly happy with the 7800 XT I got for that price. If you want more, then buy more. If you want less, then buy less. All I'd recommend is reading reviews, not jumping on any bandwagon, and making an informed decision before you buy.
Posted on Reply
#107
umeng2002
If you just set all the game settings to medium, you can save a lot of money. Today it's ray tracing killing performance; a few decades ago it was MSAA. When I got into PC gaming, I just put up with the jaggies.
Posted on Reply
#108
stimpy88
umeng2002: If you just set all the game settings to medium, you can save a lot of money. Today it's ray tracing killing performance; a few decades ago it was MSAA. When I got into PC gaming, I just put up with the jaggies.
If you want your game to look like a 10-year-old title...
Posted on Reply
#109
AusWolf
stimpy88: If you want your game to look like a 10-year-old title...
It depends on the game, though. In most modern games, there's barely any difference between high and medium.
Posted on Reply
#110
Onasi
stimpy88: If you want your game to look like a 10-year-old title...
Let's not exaggerate; modern games look absolutely fine on medium settings and far better than they looked a decade ago on Ultra. This obsession with "if you aren't running Ultra you're not playing" is ridiculous.
Posted on Reply
#111
Waldorf
@umeng2002/Onasi

I didn't get a 50-inch UHD screen to play Minecraft at 480p. ;)

One of the main reasons I play on PC is to have higher resolutions and more options (hardware-wise) than a console would offer (besides mouse + keyboard), and while it might not be visible in every game and all the time, I prefer to run (smartly) maxed graphical settings; even in Siege I'm "static" long enough (defending in PvE) to see the difference.
I'd rather lower the resolution to 1440p, which also allows for much bigger fps gains than lowering graphical detail/settings.
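For reference, a quick back-of-envelope sketch of why that works: assuming a GPU-limited game whose rendering cost scales roughly with the number of pixels drawn (a simplification, real scaling varies per game and engine), the pixel counts alone explain most of the gain:

```python
# Back-of-envelope pixel counts at common gaming resolutions.
# Assumption: a GPU-limited workload whose cost scales roughly with pixels rendered.
def pixels(width, height):
    return width * height

uhd = pixels(3840, 2160)  # 2160p "4K"
qhd = pixels(2560, 1440)  # 1440p
fhd = pixels(1920, 1080)  # 1080p

print(f"2160p: {uhd / 1e6:.2f} million pixels")
print(f"1440p: {qhd / 1e6:.2f} million pixels -> {uhd / qhd:.2f}x fewer than 2160p")
print(f"1080p: {fhd / 1e6:.2f} million pixels -> {uhd / fhd:.2f}x fewer than 2160p")
```

Dropping from 2160p to 1440p renders roughly 2.25x fewer pixels, and 1080p roughly 4x fewer, which is why a resolution step usually buys far more fps than dropping a quality preset.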
Posted on Reply
#112
stimpy88
Onasi: Let's not exaggerate; modern games look absolutely fine on medium settings and far better than they looked a decade ago on Ultra. This obsession with "if you aren't running Ultra you're not playing" is ridiculous.
640K is all you'll ever need and want, aye?
Posted on Reply
#113
Onasi
stimpy88: 640K is all you'll ever need and want, aye?
Yeah, because that is exactly what I said. Surely there is no middle ground between "Ultra or death!" and "I only want Wolf 3D graphics". Good job being reductive.
Posted on Reply
#114
Waldorf
@Onasi
While I assume that post was in sarcasm mode, I prefer to argue with the folks on the "ultra or nothing" side, as they at least aim for something (as in above console graphics), rather than "medium is fine", which isn't really why I (and most here) spend money on better hardware than what a console can do.

There are enough sites doing reviews where the difference between minimum and maximum settings is shown, usually including the performance hit that comes with it.
So if turning shadows down from Ultra gives me enough fps while I only lose some shadow detail, fine; but when it comes to textures, tessellation etc., I don't care for less than high/max, or I don't need a (gaming) PC.
Posted on Reply
#115
Onasi
@Waldorf
That's all fine and dandy, and I don't claim to represent anyone's opinions but my own. But I do understand that not everyone upgrades often. Not everyone buys the most powerful hardware available. Everyone has different needs and everyone has a different financial situation. Not everyone lives in a first-world country with access, much less the means, to top-tier hardware at good prices on demand. Not everyone plays or is interested in the latest AAA titles either. I strongly believe that the strongest attribute of PC gaming is options and scalability. There is something for everyone. THAT is what makes it superior to consoles, not the "better graphics" angle. And I truly believe that graphical elitism and the push for Ultra/Maximum/Epic/Unreal/Nightmare/EldritchAbomination settings has poisoned the well around the hobby and potentially turns some people off. I don't want that. I want the enthusiast community to be as welcoming and as open as it can be.
Posted on Reply
#116
THU31
Waldorf: @Onasi
I prefer to argue with the folks on the "ultra or nothing" side, as they at least aim for something (as in above console graphics), rather than "medium is fine", which isn't really why I (and most here) spend money on better hardware than what a console can do.
I'd say that medium is actually fine, usually. To me, PC is more about frame rate and resolution. So many console games upscale from 720p these days, which is completely unacceptable, and some of them don't even hit a locked 60 FPS.

What sucks is that a $600 GPU barely gets you 1440p60 (a target that stems from how poorly console ports run), and usually not even at max settings. It's sad when you buy a new card at what used to be high-end prices not long ago and have to compromise on quality right away.

But I think the situation will stabilize when the next generation of GPUs comes out. We had the jump to "next-gen" games last year, so progress should slow down for a few years.
Posted on Reply
#117
AusWolf
THU31: I'd say that medium is actually fine, usually. To me, PC is more about frame rate and resolution. So many console games upscale from 720p these days, which is completely unacceptable, and some of them don't even hit a locked 60 FPS.

What sucks is that a $600 GPU barely gets you 1440p60 (a target that stems from how poorly console ports run), and usually not even at max settings. It's sad when you buy a new card at what used to be high-end prices not long ago and have to compromise on quality right away.

But I think the situation will stabilize when the next generation of GPUs comes out. We had the jump to "next-gen" games last year, so progress should slow down for a few years.
That's why I'm enthusiastic about RDNA 4. If the rumours are true, and we get decent mid-range performance for 500 bucks, I'm in. I'm not enthusiastic about Blackwell, though. I'm sure it'll be super fast, but also super expensive, which eliminates me as a potential buyer.
Posted on Reply
#118
Waldorf
@Onasi
The same way you are fine with medium settings or prioritizing scaling etc., I should be able to say I want max/ultra without getting funny comments or being required to "defend" it.

To be clear: I never cared what anyone in the game/hardware industry said about "settings", or what I need/have to buy.

And that's still ignoring that most games will look and run better with high graphical settings at (one step) lower resolution than at native resolution with medium settings, as resolution is the biggest fps eater.
E.g., Siege at 1440p looks better on my 50-inch UHD screen running maxed settings than it ever did with medium settings on the 32-inch QHD monitor I had before.

@THU31
Not saying I'm fine with Nvidia's pricing/structuring, but my LC 2080 Super was close to $1,000 (MSRP), and it's nowhere near the performance of what I can buy now for half that.

Short of a bad console port, it also depends (very much) on the games you play, so anyone playing AAA titles from 4+ years ago would easily be able to do 4K on the same card that barely gets you 1440p in new releases...

@AusWolf
The same way the rumors for AMD can be wrong, they could be good for Nvidia (we can still hope).

But until we see stuff on the shelves with a price tag, one unreleased product isn't "better" than the other.
Posted on Reply
#119
Gica
GTX 960
GTX 1060
RTX 2060
RTX 3070 Ti
The plan: nothing until 2025.
The reason: the high price paid for the RTX 3070 Ti (2021, the year of mining madness).

My biggest disappointment was the 2600 XT AGP, bought for more than the national average salary in the very year Romania joined the EU. Crysis had just been released, and I was amazed to see that the card struggled even at medium settings in 1080p.
Relative to the average salary today, that video card would be the equivalent of more than $1,200. That's why we in the East look somewhat differently at the evolution of video card prices. Now I don't get shit for $1,200.
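For anyone curious how that salary-relative comparison works, here is a minimal sketch of the arithmetic. All figures are hypothetical placeholders (not actual Romanian salary statistics or the card's real launch price); the point is only the ratio:

```python
# Hypothetical illustration of comparing GPU prices relative to average salaries.
# All figures below are placeholder assumptions, NOT real statistics.
card_price_2007  = 160.0   # assumed price of the 2600 XT AGP at the time (USD)
avg_salary_2007  = 140.0   # assumed average monthly salary in 2007 (USD equivalent)
avg_salary_today = 1100.0  # assumed average monthly salary today (USD equivalent)

# Express the old purchase as a fraction of the old average salary...
salaries_spent = card_price_2007 / avg_salary_2007

# ...then scale by today's salary to get the "felt" equivalent price now.
equivalent_today = salaries_spent * avg_salary_today
print(f"{salaries_spent:.2f} average salaries then ~= ${equivalent_today:,.0f} today")
```

With these placeholder figures, a purchase worth just over one average salary back then maps to roughly $1,250 at today's salary level, which is the kind of comparison being made above.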
Posted on Reply
#120
Prima.Vera
Stay tuned for some more proprietary $hit running only on the 50x0 series, for the bestest and fastest gaming eggsperience.
Posted on Reply
#121
portmaster
Gica: GTX 960
GTX 1060
RTX 2060
RTX 3070 Ti
The plan: nothing until 2025.
The reason: the high price paid for the RTX 3070 Ti (2021, the year of mining madness).

My biggest disappointment was the 2600 XT AGP, bought for more than the national average salary in the very year Romania joined the EU. Crysis had just been released, and I was amazed to see that the card struggled even at medium settings in 1080p.
Relative to the average salary today, that video card would be the equivalent of more than $1,200. That's why we in the East look somewhat differently at the evolution of video card prices. Now I don't get shit for $1,200.
Same country, and lots of cards before these, but let's talk proper unified-shader/CUDA GPUs only:

9800 GT 512 MB, 2008, for under $200 new at the time and exchange rate
GTS 250 1 GB
9800 GTX, 2014, for $30
GTX 760, 2015, for $80
GTX 960, 2017, for around $100
GTX 280, free from a friend
a few RX 460s/570s/580s, 2018-2022, for around $120
RX 570 swapped for a 1050 Ti plus an LGA1155 setup and 8 GB DDR3, 2022
GTX 1070 Ti for $130, 2024

The biggest disappointment was finding out that the first 9800 GT was showing artifacts because the memory couldn't handle the standard ASUS factory OC; instead of sending the card in for warranty, I gamed on it underclocked for a whole year, only to receive a dead GTX 260 as a warranty replacement. Crysis played relatively fine at 30 fps max, as did most titles of those days, but games started to get boring and I never finished more than 3 or 4 games after GTA 4, so trying to justify more than $200 for a video card is hard.

Even since getting a 4K HDR monitor for proper image quality, the 1070 Ti has been an undervolt champ and runs 4K in most older titles and racing sims with peak efficiency. I could have got an RTX 2060 Super or so for the same money, but as far as I understand it's not as good in raster performance in older titles, and when it comes to ray tracing... I'm starting to realize it's a sort of gimmick, like PhysX was. So I'm glad I didn't get on the disappointment bandwagon with the RTX 2000/3000 series, since even the 12 GB 3060, which seems like a reasonable upgrade, isn't worth it for a 10-20% performance improvement over the 1070 Ti. The only game I play periodically that doesn't do 4K60 is CoD Warzone 2, and I don't care, since it looks fine at 1080p. Nvidia is profiting from the hipster hype and the AI/mining boom as much as it can, since it knows most gamers only buy its high-end cards once they can get them for next to nothing a few years later.
Posted on Reply
#122
Onasi
@Waldorf
Jesse, what the f**k are you even talking about? Nobody said you can't prioritize or enjoy high-end settings. This whole discussion started when someone said that lowering settings in games "makes them look like games from 10 years ago". All I did was point out that this is an exaggeration and that, for the most part, running medium settings will get you decent graphics if that's what you have to do.
That’s it. There was no crusade against people playing on max settings. You have created a windmill and are currently bravely charging it.
Posted on Reply
#123
Waldorf
@Onasi
So then, what makes your opinion (as in "medium is fine") more valid than someone saying "only ultra is playing"? Which is what you're saying by saying "this obsession ... is ridiculous".

Doesn't sound very "inclusive" to me.
Posted on Reply
#124
AusWolf
Waldorf: @Onasi
So then, what makes your opinion (as in "medium is fine") more valid than someone saying "only ultra is playing"? Which is what you're saying by saying "this obsession ... is ridiculous".

Doesn't sound very "inclusive" to me.
Why does either opinion have to be more valid than the other? Can't they both be fine? :(

I aim for maximum graphics in every game, the same as you, but I accept that sometimes I have to make compromises with high or medium, which isn't nearly as noticeable and distracting as it was 10-15 years ago.

Edit: Saying that sometimes (most of the time?) medium is fine, too, isn't an argument against playing on ultra graphics.
Posted on Reply
#125
Onasi
Waldorf: So then, what makes your opinion (as in "medium is fine") more valid than someone saying "only ultra is playing"?
Where have I said that? Do quote me, please.
Waldorf: Which is what you're saying by saying "this obsession ... is ridiculous".
You have an interesting way of interpreting things. But do keep on fighting windmills.
Posted on Reply