
NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

Wow, amazing. Running two different displays, one of which is playing a recording clamped to the sRGB color space. Can't believe they actually look different!

Nah, the explanation is that Nvidia cards can't display colors. You heard it here first.

Adrenalin has an option to make the colors super saturated; you just click it to enable (or it's on by default, not sure), which could explain any differences. I have my 6900HS laptop and my 4090 desktop connected to the same screen, and the colors are identical.

Luckily RT performance is still not needed as much as pure raster performance.
Personal preference. If you want to buy a card and keep it for a long time, AMD is better, because 2-3 years down the line RT won't be playable on your old card even if it's an Nvidia one. If you upgrade every generation, Nvidia is clearly the better option, though.
 
Have no idea why people care about stock power draw
Personally I don't care much for it, but as I mentioned in another thread, people don't care about it until they can use it against AMD.
No, not because of power consumption, but because of efficiency. The two are different. Put a 7900 XTX and a 4080 both at 250 W and see which one is faster. That's efficiency. Power draw is irrelevant; a card can draw 2 kilowatts and it's fine, as long as you can limit it to 300 W and it's still fast.
There is a point there, but conveniently, you left out why there's a difference in this particular case: it's down to the foundry node used.
Slightly more than that:
Hence why I said "Properly".
I think what he meant is there are probably 5 games where RT actually meaningfully improves the visuals....
Correct, thank you.
Having 5 games with meaningful RT, that's a LOT.
Currently, yes. Perhaps that will grow, but for now, it stands. Thanks.
I do expect more and more games over time to get amazing RT
Same here, but so far, 99% of the current ones rely on standing in front of puddles or mirrors, and I personally don't find that it adds much to the gameplay, especially not enough to justify the insane performance hit.

anyone who thinks otherwise is delusional.
I am not entirely convinced that we will ever get to that point, but I could be wrong. We are now 5 years or so into the RT hype, and the results are few and still "dependent" on GPUs that go for over US$1,500. By the way, those first couple of generations of RTX GPUs already can't do much in the RT department.
ton of games that the performance hit doesn't justify the visual upgrade.
That's my main problem with this. I love The Ascent, and enabling RT drops the fps from 250+ to 50 or so on my XTX, and the only place I can see it is in puddles, which honestly don't add anything to the gameplay.
I have other games in my backlog that might have better implementations, but as said, they won't add much, if anything, to the gameplay.
IMO, what we need is better surfaces on humanoids and other living things, especially in the rain, and better animations. Map detail and lighting are awesome, but humans still look and act like porcelain dolls.
Agreed, and I think it's worth adding: good gameplay. But that is not the GPU's problem. :)
 
Personally I don't care much for it, but as I mentioned in another thread, people don't care about it until they can use it against AMD.
I've explained it to you twice now, but you still don't get it. AMD's problem isn't power draw, it's efficiency. You can limit the power draw on an AMD card just like you can on an Nvidia card, but it will still draw more power for the same performance.

There is a point there, but conveniently, you left out why there's a difference in this particular case: it's down to the foundry node used.
Who cares about the why? The end result is what matters.
 
Maybe you both are trolls, no?

Which product is slow?

There must be a global professional investigation into Nvidia for cheating: using DLSS as the default setting, making slow GPUs appear good on the charts.

Let's compare the slow RX 7600 with the even slower RTX 4060, which is a junk, leftover byproduct: a rebadge of something GT **30 class or *50 LE class.

RTX 4060 vs RX 7600:

Theoretical specs are meaningless. Modern raytraced games run like dogshit on AMD and you know it.

It only uses software Lumen on PC/console, I believe; there is no way the reflections on water would look as bad as they do if it was hardware Lumen. Otherwise the game looks fantastic, and other than resolution, it even looks great on console.

Honestly it looks much better than a lot of games with RT...
Ah, thanks for the correction!

D4?! Lmao. Hellblade doesn't need it either... or Ghostwire.

I see we have journeyed into green vs. red and RT fantasy land... I guess that's all this thread's gonna be now.
I’ll just assume you haven’t played them with RT. Because you know developers have all this spare time to go back to older releases to add features that provide no benefit, right?
 
I've explained it to you twice now but you still don't get it
Same for you. I am clearly talking about the usage. You're playing word games at this point.
AMD's problem isn't power draw, it's efficiency. You can limit the power draw on an AMD card just like you can on an Nvidia card, but it will still draw more power for the same performance.
See? That's what I'm talking about: it doesn't matter until it does.

Who cares about the why? The end result is what matters.
See above.
Looking at my XTX during gaming, I see it reporting around 350 W without undervolting.
Per reviews, the 4080 is at around 305 W for less performance. So I don't see an issue there, nor do I care enough to hound someone on a forum over it.
But as I said, when convenient, power consumption/efficiency is indeed an issue. :peace:
 
Personal preference. If you want to buy a card and keep it for a long time, AMD is better, because 2-3 years down the line RT won't be playable on your old card even if it's an Nvidia one. If you upgrade every generation, Nvidia is clearly the better option, though.
Only if you value RT in current games enough to pay the extra.

That said, upgrading with every generation is becoming increasingly foolish, but that's another matter.
 
Same for you. I am clearly talking about the usage. You're playing word games at this point.

See? That's what I'm talking about: it doesn't matter until it does.


See above.
Looking at my XTX during gaming, I see it reporting around 350 W without undervolting.
Per reviews, the 4080 is at around 305 W for less performance. So I don't see an issue there, nor do I care enough to hound someone on a forum over it.
But as I said, when convenient, power consumption/efficiency is indeed an issue. :peace:
You can't compare power draw from reviews with your PC. If you had a 4080 in your system, maybe it would draw 250 W with your use case. Also, it's not really slower, but that's irrelevant.
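To put rough numbers on the draw-vs-efficiency distinction being argued here, a minimal sketch of the "cap both cards and compare" test mentioned earlier in the thread; the fps values are made-up placeholders, not measurements of any real card:

```python
# Rough perf-per-watt comparison at an equal power cap.
# The fps figures below are placeholders, NOT benchmark data.
POWER_CAP_W = 250  # both cards limited to the same board power

fps_at_cap = {
    "Card A": 95,   # placeholder
    "Card B": 110,  # placeholder
}

for name, fps in fps_at_cap.items():
    # fps is frames/second and watts are joules/second,
    # so fps/W works out to frames rendered per joule.
    print(f"{name}: {fps} fps at {POWER_CAP_W} W -> {fps / POWER_CAP_W:.3f} fps/W")
```

At the same cap, whichever card posts more fps is the more efficient one, regardless of what either draws at stock.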
 
I think what he meant is there are probably 5 games where RT actually meaningfully improves the visuals....

For me it's

Witcher 3 NG
Cyberpunk 2077
Alan Wake 2
Ratchet and Clank
Control
Metro Exodus EE
Spiderman
Quake 2 RTX
Portal RTX
Minecraft (haven't played this but it does look way better with RT)

Most games do it pretty terribly, though. RE 4 Remake is absolutely trash when it comes to RT, as are those F1 games and countless others. I'd say probably 1 in 10 games that has RT does it well, and that might be generous...
Serious Sam 1st & 2nd Encounter
DOOM 1 & 2
Quake 1
Half-Life 1
 
500W + Transient Peaks, over the 12V-2x6? :roll:

Looks like Intel is setting a precedent:
It's okay to engineer products that will fail inside warranty.
Meh. The 3090 Ti peaked at 625 watts and those didn't fail constantly.

Just pulling power isn't an issue.
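The back-of-envelope math backs this up, assuming the published connector spec (six 12 V pins rated around 9.5 A each, the basis of the 600 W continuous rating) and current shared evenly across pins:

```python
# Back-of-envelope per-pin current on a 12V-2x6 / 12VHPWR plug.
# Assumes the published spec: six 12 V pins, ~9.5 A continuous each,
# and current shared evenly across the pins.
PINS = 6
PIN_RATING_A = 9.5
VOLTAGE = 12.0

for load_w in (500, 600, 625):  # steady 500 W, spec max, 3090 Ti-style peak
    per_pin = load_w / VOLTAGE / PINS
    print(f"{load_w} W -> {per_pin:.1f} A/pin ({per_pin / PIN_RATING_A:.0%} of rating)")
```

Even the 625 W peak works out to roughly 8.7 A per pin, under the per-pin rating. The reported failures were largely traced to poorly seated plugs concentrating current on one or two pins, which breaks the "shared evenly" assumption above.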

The important question is: for how many games do you upgrade your GPU? 'Cause 5 is a lot. I mean, I've played 10-15 games with my 4090, but half of them worked fine with my old card; I upgraded my GPU just for a couple of them. Having 5 games with meaningful RT, that's a LOT.
I usually wait to upgrade until there is a game I can't play. Last time it was Halo Infinite multiplayer.
 
Theoretical specs are meaningless. Modern raytraced games run like dogshit on AMD and you know it.


Ah, thanks for the correction!


I’ll just assume you haven’t played them with RT. Because you know developers have all this spare time to go back to older releases to add features that provide no benefit, right?
I did. It's unimpressive.
 
I’ll just assume you haven’t played them with RT. Because you know developers have all this spare time to go back to older releases to add features that provide no benefit, right?
Do you think everything companies do these days has meaning besides completing a pointless tick box exercise? You'd be surprised.
 
If these are correct, it could mean the 5060 will be based on the GB205 like the 5070, and both cards will have 12 GB of VRAM.

That would be great news for the 5060, but not for the 5070. 12 GB on a card that will probably cost $600+ is not acceptable in 2025.
 
If these are correct, it could mean the 5060 will be based on the GB205 like the 5070, and both cards will have 12 GB of VRAM.

That would be great news for the 5060, but not for the 5070. 12 GB on a card that will probably cost $600+ is not acceptable in 2025.

GDDR7 does apparently come in weird configurations compared to GDDR6X, so who knows what SKUs we will get. I won't be surprised if the 5060 gets cut down to 10 GB so Nvidia can save a couple of bucks on the BOM.
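The "weird configurations" largely come down to module density: capacity is one module per 32-bit channel times the GB per module, and GDDR7 adds 3 GB (24 Gbit) modules alongside the usual 2 GB parts. A quick sketch of the combinations (simple arithmetic illustration, not a leaked SKU list):

```python
# VRAM capacity from memory bus width and per-module density.
# GDDR6/GDDR6X modules are typically 2 GB (16 Gbit); GDDR7 also
# comes in 3 GB (24 Gbit), which is where the odd capacities come from.
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    chips = bus_width_bits // 32  # one module per 32-bit channel
    return chips * module_gb

for bus in (128, 160, 192, 256):
    print(f"{bus}-bit: {vram_gb(bus, 2)} GB (2 GB modules) "
          f"or {vram_gb(bus, 3)} GB (3 GB modules)")
```

By that math, a 10 GB 5060 would imply a 160-bit bus with 2 GB modules, while 3 GB GDDR7 modules are how a 192-bit card could land on 18 GB instead of 12.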
 
Still waiting on the Seasonic gen 2 12V-2x6 cable.
 
Since when? Have you asked AMD? And how many Radeons and Intel Arcs exactly use this new low-quality power connector?
Hasn't ASRock so far been the only AIB from camp AMD to use this stupid connector?
 
Hasn't ASRock so far been the only AIB from camp AMD to use this stupid connector?
AFAIK, yes.

Considering that Nvidia wants 500+ W pulled through it, I'd imagine the 'blower-style' 7900 (power-limited within the constraints of its cooling) is *not* going to be a problem.

Regardless, I'm not happy about the connector 'gaining any traction' in the market.
 
Goto mighty, no, count da money. I got mine! $$$$. Me waiting for a new NV GPU that can match the 57" Neo G9. :pimp: It's good to be the King!
 