
Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

1 to 4% faster in raster, 20% slower in RT, higher median power consumption, very hit-and-miss driver support, and no access to the Nvidia ecosystem. The 4080 will provide a better gaming experience, I can guarantee you that.

The only circumstance in which I'd consider the 7900 XTX is if you can buy it for $300 less.
That's my GPU budget lol

I just turn down settings rather than spending more money lol
 
I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
Leave it to the fanboy to pull out the term "fanboy" before anyone even comments, and then start gulping down Jensen's meat like you get anything in return for it.

Really looking forward to another disappointing launch, since that has been the rule since 2018.
I really wonder, how will Nvidia surprise us? Entry level at €500 with a 92-bit bus and a 65 mm² die?
 

If AMD does not step up, yes. But I have a feeling that things will be lively in the midrange with RDNA 4 and Battlemage.
 
LMFAO, I'm an Nvidia fanboy, and I'm ashamed of the cr*p bus widths Nvidia offers. I'm OK with the VRAM, though :D
 
RT still isn't viable, as the performance hit is still too big without DLSS.

DLSS is ok but so is FSR

And yeah, I hear that a lot. Which is funny, because I've used AMD since the HD 4000 days and haven't had driver issues since Hawaii, which was quite some time ago.

Yeah right. My crappy 6400, which replaced my 1050, had trash drivers, and they never fixed the issues; in fact, they added more issues later on. And before you come here to tell me I should have reported those issues: I did, and almost a year after I reported them, they were still there. So after hating that card for its crappy drivers, I ended up with a 4600, and now my driver problems are gone.

So for those of you who said AMD has no driver issues: they do. Just check their forums or their subreddit; nobody is listening either.

Nvidia brand loyalists are fixated on three things:
  • RT
  • DLSS
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Outside of those three things, the GPU market looks very even and competitive with AMD doing slightly better in performance and price as you pointed out. But even if all three of my points above didn't exist, these loyalists would still buy Nvidia. But I appreciate you and everyone else doing what they can to prevent the blind fealty to one company that threatens to ruin our DIY PC building market that we love so much.

Nvidia has far fewer driver issues than AMD; at least Nvidia tries to fix them, while AMD doesn't seem to care.
 
I don't need 24 GB on a 5070; it's a waste of power. I'll wait for the 18 GB Super refresh with 3 GB modules.
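For anyone wondering how the 3 GB-module math works out: GDDR cards normally pair one memory chip per 32-bit channel, so total capacity is (bus width / 32) × per-chip capacity. A minimal sketch (the 192-bit figure below is illustrative, not a confirmed Blackwell spec):

```python
# Sketch: total VRAM from bus width and per-chip GDDR density.
# Assumes the usual one chip per 32-bit channel, no clamshell mode.

def vram_gb(bus_width_bits, chip_gb):
    chips = bus_width_bits // 32       # number of memory chips
    return chips * chip_gb             # total capacity in GB

# Illustrative 192-bit card: today's 2 GB chips vs. upcoming 3 GB chips.
print(vram_gb(192, 2))  # 12
print(vram_gb(192, 3))  # 18
```

Clamshell designs double this by putting two chips on each 32-bit channel, which is how 16 GB variants of 128-bit cards exist.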

The cut-down memory buses compared to Ampere make no sense. The 60-class should be 192-bit and the 70-class should be 256-bit. That's the main problem with the Ada line-up, and it's gonna be the same here.
But I guess Jensen thinks more cache makes up for the narrower bus. It can partly offset the lost bandwidth, but it does nothing for capacity.
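The raw bandwidth cost of a narrower bus is easy to put numbers on: peak bandwidth is the bus width in bytes times the effective per-pin data rate. A rough sketch with made-up example speeds, not leaked figures:

```python
# Sketch: peak memory bandwidth from bus width and effective data rate.
# bandwidth (GB/s) = (bus width in bits / 8) * data rate in Gbps per pin

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

# Example: a 128-bit bus needs much faster memory to approach a 192-bit one.
print(bandwidth_gb_s(128, 28.0))  # 448.0
print(bandwidth_gb_s(192, 21.0))  # 504.0
```

This is also why extra cache only papers over the narrow bus: it reduces how often the GPU must go out to VRAM, but the worst-case bandwidth and the total capacity are unchanged.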
 
Is nobody going to comment on how they incorrectly assume the core count of GB206? If a TPC has 256 cores and there are 18 TPCs, shouldn't that be 4608 cores total? Or am I missing something here?
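The arithmetic above checks out: on recent NVIDIA architectures a TPC holds 2 SMs with 128 FP32 cores each, i.e. 256 cores per TPC, so a fully enabled 18-TPC part would come to 4608. A quick sketch of that assumption:

```python
# Sketch: CUDA core count from TPC count, assuming the usual
# 2 SMs per TPC and 128 FP32 cores per SM on recent NVIDIA parts.

SMS_PER_TPC = 2
CORES_PER_SM = 128

def total_cores(tpc_count):
    return tpc_count * SMS_PER_TPC * CORES_PER_SM

print(total_cores(18))  # 4608
```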
 
Tell that to my brand-new 8 GB card that can't even start The Last of Us because of low VRAM. Yes, the latest driver bugged the game; Nvidia has already stated as much in the driver notes. It used to work fine with 8 GB, or well... after the patches. The point is, having more VRAM is good. It's not empty crying; it can prevent issues even when it does nothing for performance. I still get nightmares from GTA 4 not having enough VRAM. Sometimes games simply need more. You're on TechPowerUp; you can clearly see that fact in the reviews here. Even at 1080p, some games need more. And that's just 2024; imagine 2025, or 2026? People don't buy a new GPU every year.

I too wish Nvidia would give us more, not because people cry, but because we NEED it.
 
There are still a lot of people out there who are really hung up on the 8 GB thing. They think games today should run fine on 8 GB because games five years ago did, and that anything that doesn't is coded wrong by bad devs, with nothing to do with the industry moving forward with newer tech.
 

Limiting the VRAM to only 8 GB (the first graphics card ever with this VRAM amount was the AMD Radeon R9 290X back in 2013, 11 years ago) effectively means that Nvidia advocates lower-resolution screens and works against improving the gamers' experience by moving to 2160p.


 
I hope they are great cards; I have no interest in RT, as it is mostly useless for my use case.

Happy 7900 XT user with no care in the world for things that are not even 25% matured or usable for most.

There will be a time things become mainstream, this time is not now, but I am not here to control who does what.

Some people rip into console users and their "fake resolution" experiences whilst pushing their own favorite upscaler.

Cognitive dissonance and a general drop in intellect have been on display since the dawn of the RTX lineup.

Sorry your user experience is not good.

I click play, it works.

pcgamingwiki.com is your friend; a lot of issues have nothing to do with drivers, it's sometimes just Windows messing with stuff, which is why I run Windows 11 Enterprise 24H2.

 

Amen! It's so bad that we have to pray Intel will overtake AMD, since AMD has abandoned the high-end market. AMD has nothing capable of competing with the 4090, 4080 Super, 4080, 4070 Ti Super, or even the 3090/3080.
 

What do you mean? AMD has GPUs that compete with everything other than the 4090, but according to the Steam Hardware Survey, less than 1% of gamers are using a 4090 anyway, so is that really important to compete with?

If you bring ray tracing into it, that changes things, though maybe not for next-gen AMD.
 
xx60 users shouldn't expect to play at 4K Ultra with RT (without upscaling). Using data implying they can't due to VRAM limitations to complain about the frame buffer, while promoting 16 GB AMD options that still lose to the 12 GB NVIDIA options at all resolutions in TPU testing just because they have more VRAM (even though the x600 class also ships with 8 GB, by the way), is irrelevant to the use case and disingenuous. 1080p is by far the most popular resolution, and even 1440p is still quite playable on an 8 GB card, even at Ultra, going by TPU testing.
The first AMD discrete GPU in the Steam charts is the RX 580 at position #31 with 0.91%, so less than the 4090; I suppose that gives an idea of how well consumers think they compete. The NVIDIA xx60 class is, of course, #1.
 




I'll check out that Enterprise Windows 11, but trust me, AMD has driver issues that are specifically AMD's fault, not Windows' or the apps'.
 

Slight correction on this: the 290X 8GB launched in November 2014; only the 4GB version launched in 2013. While the 390X was the first desktop GPU with 8GB, the first 8GB GPU overall was the 880M, launched in March 2014, followed by the 980M in October 2014. Then, in March 2015, Nvidia went to 12GB with the Titan X. You should also remember that AMD released a 2GB, 32-bit GPU in 2022; it may have been OEM-only, but they still launched it, along with a 64-bit, 4GB laptop salvage part in the 6400/6500 XT.

I'm not saying 8GB today is good (and Nvidia certainly overprices it by $100 or more), but you have to admit both sides have their issues with the amount of VRAM they use. We should also slam the memory IC makers, who could have made 4GB chips for GDDR5X, GDDR6 and GDDR6X (plus 3GB for the latter two) but instead stuck with 1GB or 2GB chips. With GDDR7 they have dropped the 1GB chips, but 3GB is not coming until a year or more after the 2GB ones.

I can certainly see the 5050, 5060, and maybe the 5060 Ti being by far the worst choices compared to AMD and Battlemage in the sub-$500 market, if only because the VRAM limits their usefulness. I certainly wouldn't recommend an 8GB card for anything but eSports or casual light gaming, and only for under $200, preferably $150 or less.
 