Thursday, September 26th 2024
NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation
Thanks to the renowned NVIDIA hardware leaker kopite7kimi on X, we are getting information about the final versions of NVIDIA's first upcoming wave of GeForce RTX 50 series "Blackwell" graphics cards. The two leaked GPUs are the GeForce RTX 5090 and RTX 5080, which now feature a more significant gap between xx80 and xx90 SKUs. For starters, we have the highest-end GeForce RTX 5090. NVIDIA has decided to use the GB202-300-A1 die and enabled 21,760 FP32 CUDA cores on this top-end model. Accompanying the massive 170 SM GPU configuration, the RTX 5090 has 32 GB of GDDR7 memory on a 512-bit bus, with each GDDR7 die running at 28 Gbps. This translates to 1,792 GB/s of memory bandwidth. All of this is confined to a 600 W TGP.
When it comes to the GeForce RTX 5080, NVIDIA has decided to further separate its xx80 and xx90 SKUs. The RTX 5080 has 10,752 FP32 CUDA cores paired with 16 GB of GDDR7 memory on a 256-bit bus. With GDDR7 running at 28 Gbps, the memory bandwidth comes to 896 GB/s, half that of the RTX 5090. This SKU uses a GB203-400-A1 die, which is designed to run within a 400 W TGP power envelope. For reference, the RTX 4090 has 68% more CUDA cores than the RTX 4080, while the rumored RTX 5090 has around 102% more CUDA cores than the rumored RTX 5080, which means that NVIDIA is separating its top SKUs even more. We are curious to see at what price points NVIDIA places its upcoming GPUs, so that we can compare generational updates and the widened gap between xx80 and xx90 models.
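The bandwidth and core-count gaps above follow from simple arithmetic; a quick sketch using the leaked figures (bus widths, per-pin data rates, and core counts are from the rumor, not confirmed specs):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bits times the
    per-pin data rate in Gbps, divided by 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 5090 (rumored): 512-bit bus, 28 Gbps GDDR7
print(memory_bandwidth_gb_s(512, 28))  # 1792.0 GB/s

# RTX 5080 (rumored): 256-bit bus, 28 Gbps GDDR7 -- exactly half
print(memory_bandwidth_gb_s(256, 28))  # 896.0 GB/s

# CUDA core gap between the two rumored SKUs, as "% more cores"
print(round((21760 / 10752 - 1) * 100))  # 102
```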
Sources:
kopite7kimi (RTX 5090), kopite7kimi (RTX 5080)
181 Comments on NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation
I hope I'm wrong, but we might be underestimating how much Nvidia doesn't need Gaming any more.
If you want sane prices, you must boycott the next generation of Nvidia GPUs; otherwise, we are headed into a future where GPUs are a luxury only for the super rich and the rest of us are F'ed over.
Regarding credit cards, I don't own one, but I could split the payment with a debit card if I wanted to. Which I don't... Perfect for your 1024x600 resolution, using DLSS Ultra Performance!
Hey, pull out that CRT monitor and you'd have the best gameplay experience thanks to 0 ms display lag.
Win-Win :)
The power supply is less susceptible to voltage sag, and overall efficiency is higher because a lot less energy is wasted in the power rail.
You don't have to isolate anything if only one rail is connected to supply the power section of the GPU.
AMD still needs to:
- match CUDA. ROCm was a good step forward, but it has a long way to go to match CUDA's market share in the professional and enterprise space (where CUDA likely holds close to 100%)
- improve its encoder's performance and quality to match Nvidia's
- improve its upscaling to match DLSS
- improve its VR support
- ensure AI performance is good and that implementations in existing software are optimized. Currently, AMD cards get about 60% of the inference performance of Nvidia cards when comparing an AMD and an Nvidia card of equal raster performance.
This would just be to catch up. I still don't think they could charge Nvidia prices even if they did all of the above, because to do that they would have to start making innovative features of their own, which frankly has not happened under Lisa Su. This also assumes Nvidia doesn't announce any new software features with the 5000 series; if it does, AMD may fall further behind.
AMD really needs to compete on pricing until it fixes all of the above. AMD likely retreated from the high end to focus on enterprise, where those barriers matter less. AMD doesn't have to worry much about CUDA there, because an enterprise customer looking at buying thousands of GPUs is likely tailoring its software solution to the hardware.
Hell, even the argument that dGPU prices are driving the masses to consoles is questionable. From what I've heard, the PS5 Pro is $700, and next-gen consoles may cost even more. Gaming is getting more expensive.
It's something that's creeping into the tech industry as well (you can rent your phone, console, or gaming PC), but as you said, at least you get something that's actually better (most of the time).
edit: In the long term, today's newest titles will become cheaper to play too, but folks aren't patient enough to wait another 6-8 years for future hardware generations.
If they don't do it now, let's hope AMD will have a 9900 XTX that can compete with the RTX 5090/Ti, for competition's sake! The perk of buying (even if more expensive) is that you own it and are free to sell it to buy a new one, whereas with leasing you always pay for something you'll never own and will never get any money back.
670 was 256-bit, 2GB or 4GB
770 was a 680 rebrand
970 was "256-bit", "4GB"
1070 was 256-bit, 8GB
2070 was 256-bit, 8GB
3070 was 256-bit, 8GB
The GTX 470 was a 320-bit true high-end GPU, the same as the 448-bit GTX 260.
Even the stupid RTX 4070 Ti for $800 has GTX 1060 ($250) specs... People are buying them like hotcakes. :kookoo: When it's a $500 GPU at best.
x50 has pretty much disappeared; Nvidia isn't even displaying a desktop RTX 4050 in its RTX 40 lineup on its website lol
x60 GPUs are the new low end
x70 GPUs are the new mainstream
x80 is the new x70 & x70 Ti, aka high end
x90 is the new x80 Ti, aka enthusiast
x90 Ti is pretty much a TITAN without the 2x VRAM increase
Every 2 years he needs a new one because the old one has been improved by A.I., so he needs the new version! He's betting a lot on Blackleather!
Will 1000 W be enough for the 5090 plus a Z890 board with a 15900K?
Though they did the same milking with the Titan X (Pascal) and Titan Xp. How fortunate that Seasonic just released a new 2200 W unit. :rolleyes:
Which will most likely be....