
NVIDIA Provides a Statement on MIA RTX 3090 Ti GPUs

I think they found out there's no performance uplift over the 3090, ha ha, what a joke.

But I believe they're waiting until the 6950 is released...
 
I still say it's because no Radeon 6950XT showed up. There's no reason for it in this market, especially if they're having issues producing one, which it seems they are.

This has never stopped Nvidia before

They probably realized releasing another video card that costs 3k is stupid, especially when the 40 series and AMD's cards are due out later this year.

This has never stopped Nvidia before

The only thing stopping Nvidia or its cards... is HEAT, boys. Heat.
They built a boost algorithm that works better than the hardware it's used on. Every time we see Space Invaders or EVGA producing yet another shite cooler, it's a heat problem.

Or, put differently for this current gen, Ampere is shit on Samsung 8nm, as it was initially scaled for TSMC.
Nobody in their right mind willingly chooses to up the TDP on the top end of their stack by nearly a third in one gen - and that's just counting the non-Ti. Every lower configuration can 'make it' fine, albeit also with inflated TDPs. But the top end... that's new territory. We didn't need Raja after all to get a solid handle on >300 W cards in the consumer space, did we...

It's a trend with all newer components now. To get more performance, we get more heat, and the margin for error is thin, but the hardware is kept from frying itself by smart algorithms.
 
Coming from a Strix 3090 OC owner: the Ti variant of the 3090 is absolutely stupid. Just look at the spec difference; it's so tiny that I doubt you'll see more than a few frames of difference, if any.

Just look at the spec difference between the 3080 and 3090: the gap between those two is pretty massive on paper, but in reality it's what, 10 to 15 fps faster in the best cases? For a 3090 Ti you would be paying way more than for a 3090, and you'd get a card with a much larger power draw, so most likely hotter, for what, 0 to 5 fps if you're very lucky.

It would be a DOA product at launch, ridiculed by reviewers; it would be the worst Ti card ever released, and Nvidia knows it.
 
They built a boost algorithm that works better than the hardware it's used on. Every time we see Space Invaders or EVGA producing yet another shite cooler, it's a heat problem.

Sorry, call me old school, but give me a stable static clock over whatever rubbish this GPU Boost system is, any day, every day... I actually do a custom curve having my card target a ~1100 MHz base so it flatlines on the maximum possible overboost bin all the time (which turns out to be the 1755 MHz target I want it to run at). GPU Boost is especially awful on cards with a lower power limit; you will pretty much never see it behave correctly - this thing loves to throttle. Trust me when I tell you it's a better experience to have stable, flatlined clocks at a very modest vcore and absolutely no thermal or power throttling whatsoever than to just let this boost algorithm rubbish run wild.
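For anyone who'd rather script a pinned clock than drag curve points around in a tool like Afterburner, here's a rough sketch using NVIDIA's NVML Python bindings (the pynvml package). It only locks the clock range - it doesn't touch the voltage/frequency curve at all - and it needs admin/root rights, so treat it as an approximation of the flat-clock setup described above, not the same thing.

```python
# Rough sketch: pin the GPU to a fixed clock range through NVML (pynvml
# package, needs admin/root and a reasonably recent driver). This only locks
# the clock range - it does NOT set voltage the way a custom curve does.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)               # first GPU in the system

TARGET_MHZ = 1755                                        # the flat clock you want
pynvml.nvmlDeviceSetGpuLockedClocks(gpu, TARGET_MHZ, TARGET_MHZ)

# ... run your game or benchmark here ...

pynvml.nvmlDeviceResetGpuLockedClocks(gpu)               # restore normal boost behaviour
pynvml.nvmlShutdown()
```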
 
Sorry, call me old school, but give me a stable static clock over whatever rubbish this GPU Boost system is, any day, every day... I actually do a custom curve having my card target a ~1100 MHz base so it flatlines on the maximum possible overboost bin all the time (which turns out to be the 1755 MHz target I want it to run at). GPU Boost is especially awful on cards with a lower power limit; you will pretty much never see it behave correctly - this thing loves to throttle. Trust me when I tell you it's a better experience to have stable, flatlined clocks at a very modest vcore and absolutely no thermal or power throttling whatsoever than to just let this boost algorithm rubbish run wild.
I sort of agree, but sort of don't. I prefer seeing flat clocks myself, though I have to admit that not every scene is the same in any game, so the card may need less or more power to render them. Targeting the TDP level instead of clocks makes sense for cooling, and is better for your PSU as well.
 
I sort of agree, but sort of don't. I prefer seeing flat clocks myself, though I have to admit that not every scene is the same in any game, so the card may need less or more power to render them. Targeting the TDP level instead of clocks makes sense for cooling, and is better for your PSU as well.

I use 3 undervolt profiles:
1725 MHz / 750 mV (~280 W)
1830 MHz / 800 mV (~310 W)
1920 MHz / 850 mV (~350 W)
I just toggle between them in-game to find the optimal FPS/efficiency balance; 90% of the time I use 1830 MHz / 800 mV, though.
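Purely to illustrate the efficiency trade-off between those three profiles (using the approximate wattages quoted above, not measured data), a quick clock-per-watt comparison might look like this:

```python
# Back-of-envelope on the three undervolt profiles above: clock-per-watt as a
# crude efficiency proxy. Power figures are the approximate draws quoted in
# the post, not measurements.
profiles = [
    ("1725 MHz / 750 mV", 1725, 280),
    ("1830 MHz / 800 mV", 1830, 310),
    ("1920 MHz / 850 mV", 1920, 350),
]

for name, clock_mhz, power_w in profiles:
    print(f"{name}: {clock_mhz / power_w:.2f} MHz/W")

# Roughly 6.16, 5.90 and 5.49 MHz/W. The middle profile buys ~6% more clock
# for ~11% more power over the lowest; the top one another ~5% clock for
# ~13% more power, which is why the middle one ends up as the daily driver.
```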
 
Sorry, call me old school, but give me a stable static clock over whatever rubbish this GPU Boost system is, any day, every day... I actually do a custom curve having my card target a ~1100 MHz base so it flatlines on the maximum possible overboost bin all the time (which turns out to be the 1755 MHz target I want it to run at). GPU Boost is especially awful on cards with a lower power limit; you will pretty much never see it behave correctly - this thing loves to throttle. Trust me when I tell you it's a better experience to have stable, flatlined clocks at a very modest vcore and absolutely no thermal or power throttling whatsoever than to just let this boost algorithm rubbish run wild.

Meh. As much as I understand the reasoning, I do disagree. A well-built boost algorithm is just extra performance, and usually where it counts.

It requires a different stance towards overclocking or undervolting, I think.
You're no longer setting the exact situation you want; you're setting the limitations you want. Much like @nguyen here above: three sets of limits for voltage and boost ensure you get the maximum clock within those limits. The net result is probably an equal amount of control compared to the old situation, but still fluctuating clocks, where the fluctuation is likely overclock headroom you'd never have had with a flat clock line.

But... it all depends on how refined and well-designed the boost algorithm is, and what its stock limits are.
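To make the "set the limits, let the clock float" idea concrete, here's a toy sketch of a limit-driven boost loop - not NVIDIA's actual GPU Boost logic, just the general shape of it: the clock steps up while power and temperature are under their limits, and backs off a bin when either is exceeded. All the numbers in it are illustrative assumptions.

```python
# Toy sketch of a limit-driven boost loop - NOT NVIDIA's actual GPU Boost
# logic, just the general shape of "you set the limits, the clock floats".
def boost_step(clock_mhz, power_w, temp_c,
               power_limit_w=350, temp_limit_c=83,
               step_mhz=15, max_clock_mhz=2100):
    """Return the next clock bin given the current power and temperature."""
    if power_w > power_limit_w or temp_c > temp_limit_c:
        return max(clock_mhz - step_mhz, 0)   # over a limit: throttle one bin
    if clock_mhz + step_mhz <= max_clock_mhz:
        return clock_mhz + step_mhz           # headroom left: take one more bin
    return clock_mhz                          # already at the ceiling

# Raising the power/temperature limits, or undervolting so the same clock
# costs less power, moves the point where this loop stops climbing - that's
# the "setting limitations, not clocks" idea in practice.
```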
 