Sunday, December 24th 2023
NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024
NVIDIA's next-generation GeForce RTX 50-series "Blackwell" gaming GPUs are on course to debut toward the end of 2024, with a Moore's Law is Dead report pinning the launch to Q4 2024. This timeline is easy to predict, as every GeForce RTX generation tends to have about two years of market presence: the RTX 40-series "Ada" debuted in Q4 2022 (October 2022), and the RTX 30-series "Ampere" in late Q3 2020 (September 2020).
NVIDIA's roadmap for 2024 sees a Q1 debut of the RTX 40-series SUPER, with three high-end SKUs refreshing the upper half of the RTX 40-series lineup. The MLID report goes on to speculate that the generational performance uplift of "Blackwell" over "Ada" will be smaller than that of "Ada" over "Ampere." With AI HPC GPUs outselling gaming GPUs by 5:1 in terms of revenue, and AMD rumored to be retreating from the enthusiast segment with its next-gen RDNA4, it is easy to see why.
Source:
Moore's Law is Dead (YouTube)
126 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024
So it's still not worth a penny over $800, and that's me being generous.
(I'm borrowing these images from the latest GPU review. Please ignore the ASRock 7900 XT.)
Cyberpunk is one of the more demanding ray tracing games, and it favors Nvidia. I do agree the 7900 XTX doesn't have playable FPS in Cyberpunk, but then, only the 4090 does. AMD is just one generation behind; they could easily reach 4090-level ray tracing performance next generation.
I don't buy cards based on what anyone says they're "for" or should do.
I buy the "best" I can get for the funds I have, usually avoiding the top model, and prefer to upgrade every gen,
as long as someone buys my existing card; at least that's what I've done for the past 22 years.
That's ignoring the fact that I don't need a 3090 or even a 4090 to play MY games; all run fine at 2160p/60 "G-Synced".
And even if I picked up a new AA title, nothing prevents me from running it at 1440p; my screen still stays at 2160p...
And value is in the eye of the beholder:
would you buy your "dream" car/bike/house (whatever it is you want that's "out of reach")
if it cost 2-3 times your yearly income? Probably not.
Would you buy it if you had 10 million in your account? Almost 100% sure you would, even though the price didn't change.
(It's one thing to say "not for me", but that doesn't mean I'll "sour" it for others just because I can't/won't spend the money.
E.g. my friend hasn't upgraded since the 2080 Ti and has a nice income from working his ass off; beyond taking care of 3 (5) kids,
he doesn't spend money on much else, so he shouldn't get a xx90 because it's bad value?)
To be clear:
nothing I've ever said on TPU means I'm OK with GPU/Nvidia prices or pricing.
As with any product out in the world there will be low-, mid-, and high-end offerings. People who are satisfied with low performance are completely free to ignore the high-end stuff but that doesn’t mean those of us with higher incomes and higher standards shouldn’t have the option to buy Lamborghinis while the rest of the peasants putter around in Honda Fits.
My limit for a GPU is around £500, and I'm mostly happy with the 7800 XT I got for that price. If you want more, then buy more. If you want less, then buy less. All I'd recommend is reading reviews, not jumping on any bandwagon, and making an informed decision before you buy.
I didn't get a 50-inch UHD screen to play Minecraft at 480p. ;)
One of the main reasons I play on PC is to have "higher/more" res/options (hardware) than a console would offer (besides mouse+KB),
and while it might not be visible in every game or all the time, I prefer to run (smartly) maxed graphical settings,
as even in Siege I'm "static" long enough (PvE defend) to see the difference.
I'd rather lower the res to 1440p, which also allows for much bigger FPS gains, than lower graphical detail/settings.
While I assume that post was in sarcasm mode: on,
I'd rather argue with folks on the "ultra or nothing" side, as they at least aim for something (as in "above" console graphics),
than with the "medium is fine" crowd, which isn't really why I (and most here) spend money on "better" hardware than what a console could do.
There are enough sites doing reviews where the difference between min and max settings is shown,
usually including the performance hit it comes with.
So if turning down shadows (from Ultra) gives me enough FPS while I only lose some shadow detail, fine,
but when it comes to textures/tessellation etc., I don't settle for less than high/max, or I don't need a (gaming) PC.
That’s all fine and dandy, and I don’t claim to represent anyone’s opinions but my own. But I do understand that not everyone upgrades often. Not everyone buys the most powerful hardware available. Everyone has different needs and everyone has different financial situations. Not everyone lives in a first-world country with access, much less the means, to top-tier hardware at good prices on demand. Not everyone plays or is interested in the latest AAA titles either. I strongly believe that the strongest attribute of PC gaming is options and scalability. There is something for everyone. THAT is what makes it superior to consoles, not the “better graphics” angle. And I truly believe that graphical elitism and a push for Ultra/Maximum/Epic/Unreal/Nightmare/EldritchAbomination settings has poisoned the well around the hobby and potentially turns off some people. I don’t want that. I want the enthusiast community to be as welcoming and as open as it can be.
What sucks is that a $600 GPU barely gets you 1440p60 (a target derived from how poorly console games run), and usually not even at max settings. It's sad when you buy a new card at what used to be high-end prices not long ago and have to compromise on quality right away.
But I think the situation will stabilize when the next generation of GPUs comes out. We had the jump to "next-gen" games last year, so progress should slow down for a few years.
The same way you're fine with medium settings or prioritizing scaling etc., I should be able to say I want "max/ultra"
without getting funny comments or being required to "defend" it.
To be clear: I never cared what anyone in the game/hardware industry said about "settings", or what I need/have to buy.
And that's still ignoring that most games will look and run better with "high" graphical settings at (one step) lower resolution
vs. native res and medium settings, as resolution is the biggest FPS eater.
E.g. Siege at 1440p looks better on my 50-inch UHD running maxed settings
than it ever did with medium settings on the 32-inch (QHD) monitor I had before.
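A quick pixel-count calculation shows why dropping resolution pays off so much more than dropping detail settings (a rough sketch; it assumes shading cost scales roughly with pixel count, which varies per game and engine):

```python
# Pixel counts for common gaming resolutions.
# Assumption: per-frame shading work scales roughly with the number of
# pixels rendered, so the ratio is a ballpark for the potential FPS gain.
resolutions = {
    "2160p (UHD)": (3840, 2160),
    "1440p (QHD)": (2560, 1440),
    "1080p (FHD)": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# 2160p renders 2.25x as many pixels as 1440p.
ratio = pixels["2160p (UHD)"] / pixels["1440p (QHD)"]
print(f"2160p vs 1440p: {ratio:.2f}x the pixels")
```

So going from 2160p to 1440p cuts the pixel load by more than half, a far bigger lever than stepping shadows or textures down one notch.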
@THU31
Not saying I'm fine with Nvidia's pricing/structuring, but my LC 2080S was close to $1,000 (MSRP),
and it's nowhere near the performance of what I can buy now for half that.
Short of a bad console port, it also depends (very much) on the games you play,
so anyone playing AAA titles from 4+ years ago could easily do 4K
on the same card that barely gets you 1440p...
@AusWolf
The same way the rumors for AMD can be wrong, they could be (good) for Nvidia (we can still hope).
But until we see stuff on the shelves with a price tag, one (unreleased product) isn't "better" than the other.
GTX 1060
RTX 2060
RTX 3070 Ti
In the plan: nothing until 2025.
The reason: the high price paid for the RTX 3070 Ti (2021, the year of mining madness)
My biggest disappointment was the 2600 XT AGP, bought for more than an average salary in the very year Romania joined the EU. Crysis had just been released, and I saw with amazement that the video card had problems even with medium settings at 1080p.
Relative to today's average salary, that video card is the equivalent of more than $1,200. That's why we in the East look somewhat differently at the evolution of video card prices. These days I can't get shit for $1,200.
9800 GT 512 MB, 2008, for under $200 new (at the time and exchange rate)
GTS 250 1 GB
9800 GTX, 2014, for $30
GTX 760, 2015, for $80
GTX 960, 2017, for around $100
GTX 280, from a friend, for free
a few RX 460s/570s/580s, 2018-2022, for around $120 each
RX 570, 2022, swapped for a 1050 Ti plus an LGA1155 board and 8 GB DDR3
GTX 1070 Ti, 2024, for $130
Biggest disappointment was finding out that the first 9800 GT was showing artefacts because the memory couldn't handle the standard Asus factory OC. Instead of sending the card in for warranty, I gamed on it underclocked for a whole year, only to receive a dead GTX 260 as a warranty replacement. Crysis played relatively fine at 30 FPS max, as did most titles of those days, but games started to get boring; I never finished more than 3 or 4 games after GTA IV... so trying to justify more than $200 for a video card is hard.
Even since getting a 4K HDR monitor for proper quality, the 1070 Ti is an undervolt champ and runs 4K in most older titles and racing sims with peak efficiency. I could have got an RTX 2060 Super or so for the same money, but as far as I understand it's not as good in raster performance in older titles, and when it comes to ray tracing... I'm starting to realize it's a sort of gimmick, like PhysX was. So I'm glad I didn't get on the disappointment bandwagon with the RTX 2000/3000 series, since even the 12 GB 3060, which seems like a reasonable upgrade, isn't worth it for a 10-20% performance improvement over the 1070 Ti.
The only game I play periodically that doesn't do 4K60 is CoD Warzone 2, and I don't care since it looks fine at 1080p... Nvidia is profiting from the hipster hype and the AI/mining boom as much as it can, since it knows most gamers only buy its high-end cards when they can get them for next to nothing after a few years.
Jessie, what the f**k are you even talking about? Nobody said you can’t prioritize or enjoy high end settings. This whole discussion started when someone said that lowering settings in games “makes them look like games from 10 years ago”. All I did was point out that this is an exaggeration and that, for the most part, running Medium settings will get you decent graphics if that’s what you have to do.
That’s it. There was no crusade against people playing on max settings. You have created a windmill and are currently bravely charging it.
So then, what makes your opinion (as in "medium is fine") more valid than someone saying "only ultra is playing"?
Which is what you're saying by saying "This obsession ... is ridiculous".
Doesn't sound very "inclusive" to me.
I aim for maximum graphics in every game, the same as you, but I accept that sometimes I have to make compromises with high or medium, which isn't nearly as noticeable and distracting as it was 10-15 years ago.
Edit: Saying that sometimes (most of the time?) medium is fine too isn't an argument against playing on ultra graphics.