
NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

What I would like to see from the RTX 5000 series is more refined efficiency, and thus shorter/smaller/thinner cards and lower TDPs (the stacked dies would help there, if the rumors are correct); a move back to standard PCIe power connectors; a jump to a VRAM standard of 12 GB, 16 GB, and 24/32 GB (leaving 8 GB for the lower tiers); more user customization and control in the VBIOS; and finally, better support for the small-form-factor PC sector.
 
I really hope the AI bubble bursts before these come out, or we will see "gamer" card prices like we have never seen before! The AI industry hates nGreedia, but until a viable alternative platform comes along, we are at nGreedia's mercy, and mercy is not something they are famous for!
 
It's possible that the rumor that AMD isn't making a high-end RDNA 4 card is only half true. It's likely that they aren't making a high-end chip (Navi 41), but that doesn't mean they won't combine mid-range Navi 42/43 chips to make a high-end card.

This approach is cheaper for AMD, and can result in high-end performance.

AMD started with chiplet GPUs with RDNA 3, and I think it's only going to continue.
 
Wasn't the main point of chiplets to be able to easily scale the design and fill the entire market?
And can you elaborate what you mean by combining mid-range chips?

Does anyone know where that rumor comes from? Because if it doesn't come from a trusted source, we should ignore it.
 
Right now, the high-end RDNA 3 cards are made up of one Graphics Compute Die (GCD) and multiple Memory Cache Dies (MCDs) on the same package.

[Image: AMD RDNA 3 Tech Day press deck slide showing the GCD + MCD chiplet package]

It's possible that with RDNA 4, AMD might combine more than one GCD on the package. My guess is that they would be smaller, mid-range GCDs rather than one big, expensive, high-end GCD. This should be cheaper to make and less risky with respect to yields and defects, while still allowing AMD to reach enthusiast-class performance.

So just because the rumor is that AMD won't make a high-end RDNA 4 chip/GCD, that doesn't mean they won't make a high-end RDNA 4 card.
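
For a rough sense of why smaller GCDs are cheaper, here's a back-of-the-envelope sketch in Python using the standard Poisson yield model. The defect density and die areas are made-up illustration numbers, not AMD's actuals:

import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    # Probability a die has zero defects under a Poisson defect model
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.001  # defects per mm^2 (hypothetical)

big = poisson_yield(500.0, D0)    # one large high-end GCD (hypothetical area)
small = poisson_yield(250.0, D0)  # one mid-range GCD (hypothetical area)

print(f"500 mm^2 die yield: {big:.1%}")    # ~60.7%
print(f"250 mm^2 die yield: {small:.1%}")  # ~77.9%

# Because good small dies can be paired up after wafer test, roughly 78%
# of the wafer's silicon is sellable, vs ~61% with the monolithic design.

That's the core of the yield argument: the same wafer produces more usable silicon when the dies are smaller, because each defect kills a smaller area.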
 
Not going to lie, if I were rich, a 2nm Arrow Lake and RTX 5090 combo would be epic as fuck. lol

Considering the price I paid for my current rig, though, a $200 CPU, a $110 mobo, and a $705 GPU on sale... meh. I am happy where I am.

If someone hires me with a decent salary next year, though, I might consider selling my current rig and getting my dream Ultima 5090 and 2nm Arrow Lake combo in winter 2024.
lol
 
meh

RX 6600 for $250 CAD ($180 USD), and turn down graphics settings and resolution.
 
At least, as I've been saying for a while, this "AI" hype will eventually lead to someone designing cheap ASICs for specific purposes, meaning the end of these "AI"-tuned GPUs in the long run.
Nvidia would be foolish to focus their development effort on this market, especially since they will soon have numerous competitors there.
BTW, wasn't there recently some news about AMD focusing on this too?
But AI is already running on specialized hardware, even on Nvidia GPUs. Jensen doesn't see what he's doing right now as "general computing"; on the contrary, the guy said that specialized hardware is the future. He calls what they are doing right now accelerated/heterogeneous computing. https://blogs.nvidia.com/blog/what-is-accelerated-computing/

Nvidia's tensor cores and AMD's/Intel's NPUs are part of the general package. They don't want to make a dedicated AI extension card. I'm seeing a lot of comparisons being made between AI and mining, but that's not what's happening; mainstream apps are already accelerated by AI-specialized hardware:
[Image: list of mainstream applications with AI-accelerated features; Intel and AMD absent]

See how Intel/AMD are not mentioned? They were late to join the game, so they will be late to get equivalent software support. Same old story, and the same reason why Nvidia still dominates the market even though they are the least-liked company, judging by what you see on forums.
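
To illustrate how ordinary that acceleration already is in software, here's a minimal PyTorch sketch (my own example, not from the list above): a half-precision matmul inside an autocast region, which gets dispatched to tensor-core kernels on GPUs that have them.

import torch

# Assumes a CUDA GPU with tensor cores (Volta or newer)
a = torch.randn(4096, 4096, device="cuda")
b = torch.randn(4096, 4096, device="cuda")

with torch.autocast(device_type="cuda", dtype=torch.float16):
    c = a @ b  # eligible fp16 matmuls are routed to tensor cores

print(c.dtype)  # torch.float16 inside the autocast region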


Also, keep in mind that the AI-specialized GPUs (A100/H100) are configured differently compared to the desktop versions. The A100 is based on Ampere, but if you compare the GA100 to the GA102 used in the 3090, you can see that it's not just a bigger die:
GA100:
[Image: GA100 block diagram]

GA102:
[Image: GA102 block diagram]
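
The headline configuration differences, summarized from the public Ampere specs (a rough sketch from memory, worth double-checking against the whitepapers):

# Compute-oriented GA100 vs gaming-oriented GA102 (full dies)
ga100 = {"rt_cores": 0,  "fp64_rate": "1:2",  "memory": "HBM2",   "display_out": "no"}
ga102 = {"rt_cores": 84, "fp64_rate": "1:64", "memory": "GDDR6X", "display_out": "yes"}

for key in ga100:
    print(f"{key:12}  GA100: {ga100[key]!s:8}  GA102: {ga102[key]!s}")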
 
Because I love a toxic relationship, I can't wait for the 50 series and a lot of gaslighting and emotional abuse from Nvidia /s
 
I hope y'all have been buying Nvidia and AMD stock (as well as Micron and Intel).

I've earned so much since 2020, I could buy several of these overpriced monstrosity cards.
 
Without competition from AMD, bend over, grab your ankles and bite the pillow.
 
You just invented the RX 7900 XTX. Well, with 1.7x its power consumption.


If we throw the ~2000 USD price tag outta the window, then yes, it's drastic. Yet the 4070 Ti is the closest $-for-$ successor, just a tad cheaper than the 1080 Ti was (inflation considered). Not that impressive, huh?
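
Rough numbers behind that claim; the launch MSRPs are public, and the cumulative CPI factor is my approximation:

# 1080 Ti: $699 at launch (March 2017); 4070 Ti: $799 at launch (January 2023)
GTX_1080_TI_MSRP = 699
RTX_4070_TI_MSRP = 799
CPI_2017_TO_2023 = 1.24  # approximate cumulative US inflation

adjusted = GTX_1080_TI_MSRP * CPI_2017_TO_2023
print(f"1080 Ti in 2023 dollars: ~${adjusted:.0f}")  # ~$867
print(f"4070 Ti MSRP: ${RTX_4070_TI_MSRP}")          # a tad cheaper, as stated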

A 4070 Ti completely blows a 1080 Ti out of the water in terms of performance and features. The quality of video encoding on the newer Nvidia GPUs alone is worth the upgrade, let alone all the RTX/DLSS features... :laugh:
 
So does a 1080 Ti compared to a GTX 580 (the same 6-year gap, and an even bigger difference in terms of speed and efficiency). It's normal for newer GPUs to be much faster and more efficient than older ones. Why you're so impressed, I don't know.
 
Because I love a toxic relationship, I can't wait for the 50 series and a lot of gaslighting and emotional abuse from Nvidia /s
I hope you think of it as entertainment, because if you do, each product cycle can be quite exciting (especially if you go back and review the "leakers" in hindsight, when it becomes obvious who is full of manure).

But we should expect a lot of BS whenever a new product arrives: cherry-picked benchmarks, new pointless gimmicks (usually some new nonsense AA technique), etc.
Then there is the overreaction from all the fanboys on the other team.
And then 2-3 years later, people finally acknowledge it was actually a good product (like with the 1080 Ti, the 2000 series, etc.).

I hope y'all have been buying Nvidia and AMD stock (as well as Micron and Intel).

I've earned so much since 2020, I could buy several of these overpriced monstrosity cards.
Remember, you haven't really earned anything until the investment is realized. ;)
If you're smart, you'll gradually sell these overpriced stocks and move the money into managed funds with a solid track record, so you can continue growing your wealth for years to come. Don't wait to see where the top of a single stock is, because by then it's too late. :)
 
Based on current demand and foundry prices I will not be surprised if 5 series comes in with another massive price hike.
It's always about the price/perf ratio,
NEVER price alone!

I don't care about naming.
If they release an RTX 5030 at $700 that's faster than an RTX 4090, then it's better price/perf.

So it's not just a price hike, it's also a performance hike.
Please, don't be stupid.
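
To put numbers on that (the $700 faster-than-4090 card is this post's own hypothetical; the baseline uses the 4090's $1,599 MSRP):

def price_perf(relative_perf: float, price_usd: float) -> float:
    # Relative performance delivered per dollar
    return relative_perf / price_usd

rtx_4090 = price_perf(1.00, 1599)  # baseline: 4090 at MSRP
rtx_5030 = price_perf(1.10, 700)   # hypothetical: 10% faster than a 4090

print(f"Hypothetical 'RTX 5030' price/perf advantage: {rtx_5030 / rtx_4090:.1f}x")  # ~2.5x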
 
They should only release mid-range GPUs for gamers and keep the high-end exclusively for AI and other professional uses.

Just do a 5060 and a 5070 for $300 and $500. Two SKUs would not tie up the production lines, and they could still make tons of AI GPUs for thousands of dollars. If they do a 5090 with 32 GB and 450+ W, it's just stupid. Who needs that?
 
Who cares? No one could afford the 4000 series, let alone these 5000 series GPUs... so sure, bring it on, and let us all watch and sigh and dream on.
 
Hi,
Not sure about not affording.
I think it's more of a priority clash, plus why give Nvidia so much money when there are so many better hobbies that live longer than GPUs lol
Same goes for Intel's toomanylakes :laugh:
 
Well, I couldn't justify spending £1,200 to £2,200 on the only two 40x0 series cards that make any kind of sense. Even the 4080 is a bad buy because it's so slow in modern games at 4K. The gap between it and the 4090 is just too wide; if the 4080 had been 20% faster, I would have pulled the trigger. The 40x0 series is the worst-value series nGreedia have ever released, and I don't think their shenanigans will stop with the 50x0 series.

I know a lot of people who felt the same and are holding on for the 50x0 series instead, as hopefully even the lower-range cards will offer some meaningful improvements over the 20x0 and 30x0 series cards everyone still uses.
 
I really hope the AI bubble bursts before these come out, or we will see "gamer" card prices like we have never seen before! The AI industry hates nGreedia, but until a viable alternative platform comes along, we are at nGreedia's mercy, and mercy is not something they are famous for!
AI is here to stay. I foresee that toilet seats will soon have AI here in Japan...
 
Hahaha, oh, I'm sure AI is going to keep being plastered all over all sorts of products that have nothing to do with it!

I was thinking more along the lines of nGreedia's stranglehold on it, and most of the major players have been announcing their own chips instead of using NV chips.
 