
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

There are published specifications for power.

I know, but it makes quite a difference whether it's mentioned in the keynote or only on the website. I doubt many people go to the website to check these days. Most people watch videos and don't read (at all ^^) spec sheets anymore.
 

Massive power consumption is likely not something NVIDIA prefers to highlight in a limited duration event. They can't touch on every single feature and data point so they need to be thoughtful about what to address and what to leave out.

Do some people want to know the thicknesses of the thermal pads on the ROG Strix cooler? I'm sure there are a few people here who would love some information.

I know there are a lot of lazy people on the Internet, many more now than there were 10-15 years ago when people actually visited corporate websites and knew what the acronyms RTFM and STFW meant. And not just here at TPU. Everywhere online about pretty much every topic.

And if Jensen stepped on the stage at CES in Las Vegas twenty years ago, he still wouldn't have time to go over all of the details on power requirements.

Should I throw a tantrum because Mercedes-Benz didn't mention how heavy the E-Class seatbelt buckles are in their 15-second television commercial? Should TPU post the entire contents of the ASUS website for a given product, plus exploded-view schematics? Should a software review of DaVinci Resolve also include the 1,000+ page manual?
 
Nice. The price is about what I expected. Looking forward to that sweet 4090 goodness. I may upgrade to triple 4k oled for the sim rig.
 
RTX 3080 MSRP: $700 (expensive) - RTX 4080 MSRP: $900... WTF?
 
Price (USD) / performance (1080p avg FPS) ≈ dollars per frame:
RX 6600: 260 / 109 ≈ 2.4
RX 6800 XT: 600 / 203 ≈ 3.0
RTX 3070 Ti: 600 / 163 ≈ 3.7
RTX 4090: 1600 / 422 ≈ 3.8
RTX 3090 Ti: 1100 / 211 ≈ 5.2

Based on previous reviews, and assuming the NVIDIA 4090 really is twice the average performance of the 3090 Ti, using the 4090's MSRP and the approximate prices I've found right now. Performance figures are from the YouTuber Hardware Unboxed.

The price/performance ratio of the 4090 is indeed better than the 3090ti, but the price/performance ratio of the 3090ti itself is terrible.

And that doesn't account for continued price cuts on old graphics cards, or new graphics cards selling for more than MSRP.
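A minimal sketch of the dollars-per-average-FPS arithmetic above, in Python; the prices and 1080p FPS values are just the rough figures quoted in this post, not official numbers:

```python
# Rough price-per-frame calculation using the approximate prices (USD)
# and 1080p average FPS quoted above (illustrative figures only).
cards = {
    "RX 6600":     (260, 109),
    "RX 6800 XT":  (600, 203),
    "RTX 3070 Ti": (600, 163),
    "RTX 4090":    (1600, 422),  # assumes ~2x the 3090 Ti, at MSRP
    "RTX 3090 Ti": (1100, 211),
}

# Sort from best (lowest $ per frame) to worst.
for name, (price, fps) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:11s}  ${price:>4} / {fps:>3} fps = {price / fps:.1f} $/fps")
```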
 
It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle nvidia completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
The 4090 will sell well to its special peer group. The 4080's price will drop later, depending on AMD's real performance/pricing. nVidia pricing 10-15% higher than AMD will continue (if only rasterization is compared) for various reasons.
 
I will wait for those 175 W 4xxx GPUs, but it's nice to see the dramatic uplift in performance at the same wattage (100-300 W range).

Ada's TSMC 4 nm looks an order of magnitude better than Ampere's Samsung 8 nm.
That will allow NV cards to stretch their legs in comfort, for whoever is willing to pay the temperature and wattage bill.
 
If you think that a 4080 12GB for $900, with slower rasterisation than the 3090/3090 Ti and crappy 192-bit / ~500 GB/s bandwidth, is stupid, just wait for the "real" 4070 with the same die but fewer cores (48 of 60), on par with the 3080, for just $700.
 
It seems to me that it's the 7800 that will try to be competitive with the 4080 12GB, the 7800 XT with the 4080 16 GB, and the 7700 XT with the 4070.

If this happens, it would be a complete nightmare and a disaster/epic fail for AMD.
Really, AD104 should be called RTX 4060 Ti, not "RTX 4080-12".

AD104: 7,680 shaders, ~500 GB/s over a 192-bit bus
GA102: 8,704 shaders, 760 GB/s over a 320-bit bus

:confused::kookoo:
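For reference, those bandwidth numbers fall straight out of bus width times per-pin data rate. A small sketch; the 21 Gbps (4080 12GB) and 19 Gbps (3080) GDDR6X rates are my assumptions, not from the figures above:

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed GDDR6X data rates: ~21 Gbps on the 192-bit AD104 card, 19 Gbps on the 320-bit GA102 card.
print(mem_bandwidth_gb_s(192, 21))  # 504.0 -> the ~500 GB/s figure above
print(mem_bandwidth_gb_s(320, 19))  # 760.0
```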
 
Oh, the "wait for 40-series pricing, it will be cheaper" takes are going to age poorly.
 
I guess there's no 4080 12 GB FE, only AIB cards. It really should have been called the 4070. I also guess it would have looked really bad to release a 285 W x70-series card; that's why the two vastly different 4080s. I'm curious what the x60 and x50 tiers will look like, and how the 4070 will be positioned if there's going to be one.

I still hold my position - not impressed.
Yep, that would definitely not have looked good. I wonder if calling the 16GB the 4080 Ti would have been better, though that would have left them without an option for a later release of a more fully enabled chip (4080 Ti Super? Ugh.).

This to me looks like Nvidia is adding even more SKUs than previously, and trying to drive up ASPs across each category. Which IMO doesn't bode well for lower end SKUs and value/performance increases. I'd be happy if we see anything more than the 1:1 perf/price increase of the previous generation, but I'm not very hopeful.
 

How do you explain the performance gap between 4090 and "4080-16" - 16,384 vs 9,728 shaders?
4090 has 68% more shaders?

This is not normal.

Look:

RTX 3090 Ti - 10,752
RTX 3090 - 10,496
RTX 3080 Ti - 10,240
RTX 3080 - 8,704

3090 Ti has only 23.5% more shaders than 3080.
And 3090 Ti has 2% more shaders than 3090.
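Those percentages are just ratios of the published shader counts; a quick sketch of the arithmetic:

```python
# Relative shader-count gaps computed from the CUDA core counts listed above.
def pct_more(a: int, b: int) -> float:
    """How many percent more shaders card A has than card B."""
    return (a / b - 1) * 100

print(f"4090 vs 4080 16GB: {pct_more(16384, 9728):.0f}% more")   # ~68%
print(f"3090 Ti vs 3080:   {pct_more(10752, 8704):.1f}% more")   # ~23.5%
print(f"3090 Ti vs 3090:   {pct_more(10752, 10496):.1f}% more")  # ~2.4%
```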
 
Hopefully we can have DLSS 3 without RT (RT off), so a 4050/4060 will be just fine for 4K 60 FPS at less than max/ultra settings.


4080 12gb
4080 12gb ti
4080 12gb super
4080 12gb super ti

4080 16gb
4080 16gb ti
4080 16gb super
4080 16gb super ti

4090 20gb/ti/super/super ti

4090 24gb ti/super/ super ti

and you can have every bit of +/-256 shaders you like
:)
 
I forgot who said it here, but they obviously anticipate having to cut pricing at some point. Start higher so you can go lower in a year. Then release the Tis to fill the gaping performance holes they've purposely left open, so once again there's that "I'm getting a deal" feel.

The above moves forward by six months if AMD throws a wrench in the works. Throw that wrench, Lisa Su, throw it. We need you to increase that market share, girl.
 
It absolutely isn't. It just establishes the 90 tier as a hyper-exclusive, pie-in-the-sky tier for the ultra wealthy - but at the same time the 80 tier is following its pricing closely, which makes even less sense. This also makes me wonder why the performance differences in Nvidia's slides (not that there was much in terms of performance numbers) were relatively modest.

I gotta say, I don't have much hope for accessibly priced GPUs from Nvidia. With a $1200 80 tier GPU, I wouldn't be surprised at all if the 4000 series gets a $499 or higher 60 tier card.
 

Well, Resident Evil Village shows some 70-80% higher performance for the 4090 over the "4080-12".

[attached: Resident Evil Village benchmark chart]
 
erased, I made a mistake.
 
Aren't all those comparisons with RT and DLSS enabled? I'm not all that interested in edge cases like that, I want something more representative.
Remember the "starting at 329$" for the 4060 in the slide, imply that we will see the same 4080
("starting at 899$) 12/16gb trick in lower tiers.
You mean 3060? There's no mention of any 40-series SKU below the 4080 12GB. And that's just the MSRP for that card; it's been that since day 1. Nvidia is just for some reason promoting that a year and a half after its launch you might actually have a chance at finding one close to MSRP (rather than any kind of price drop, which would be the logical thing to expect that long after launch).

Which just supports my point: if Nvidia is saying "hey, the 3060 is great value at $329" in mid-to-late 2022, then I wouldn't be surprised at all to see them launch a 4060 Ti at $499 or higher in early 2023 (with a 4060 around $399-450).
 

You are right, my mistake. I thought it was the 4060 in the slide.
 

I think this is the best representation of the least CPU-bound scenario possible. When the CPU is the bottleneck, the cards literally show no performance difference.
Very odd.


I think nvidia is using the old series to fill the gaps because of the huge oversupply - it's hard to get rid of the stock. But EVGA said "bye-bye nvidia" :D
 
That's possible, but it's still rather suspect that they're exclusively showing off RT+DLSS titles - though at this point we know that Nvidia cares more about selling their proprietary tech (and thus tying people into their ecosystem) than selling GPUs.
Well, more like they've got so much stock they can't launch anything new to replace that without having a fire sale to clear out stock. Which would of course eat into their margins, and force them to give retroactive rebates to AIB partners to have them agree to the sale. Nvidia would much rather tout the value of a two-year-old product still not selling at MSRP.

I really, really hope AMD comes out swinging with 7000 series pricing. This just seems like such a massive opportunity for them, regardless of absolute performance.
 
Have fun playing with these with electricity prices rising.

Also, WTF is that "4080 12GB"? That should be named a 4070 or 4060 Ti.
 

Well, actually the 30 series was not normal in this regard. Look at all the previous generations: the x80 Ti usually had ~40% more shaders than the x80, and the Titan (full GPU) had about 50% more. But those Ti cards usually came out a year after the plain x80.
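A quick check of that rule of thumb against the publicly listed CUDA core counts of a few earlier generations (the counts below are my own reference numbers, not figures from this thread):

```python
# Shader (CUDA core) counts for a few earlier generations, to sanity-check the
# "x80 Ti has ~40% more, Titan ~50% more shaders than the x80" rule of thumb.
generations = {
    "Maxwell": {"x80": 2048, "x80 Ti": 2816, "Titan": 3072},  # GTX 980 / 980 Ti / Titan X
    "Pascal":  {"x80": 2560, "x80 Ti": 3584, "Titan": 3840},  # GTX 1080 / 1080 Ti / Titan Xp
    "Turing":  {"x80": 2944, "x80 Ti": 4352, "Titan": 4608},  # RTX 2080 / 2080 Ti / Titan RTX
}

for gen, c in generations.items():
    ti = (c["x80 Ti"] / c["x80"] - 1) * 100
    titan = (c["Titan"] / c["x80"] - 1) * 100
    print(f"{gen}: x80 Ti +{ti:.0f}%, Titan +{titan:.0f}% vs x80")
```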

The 3090 and 3090 Ti were absolutely terrible value over the 3080. The 3080 is one of their best cards, and it was the first time since Kepler (700 series) that they used their biggest GPU in a plain x80 card.

The 4090 actually looks to be better value than the 4080s. Its biggest problem is the heat output. I could probably save up to buy one, but the card would be unusable in the summer. Undervolted it will consume 300-350 W, but that is still too much for me.
 
The 3090, 4090, etc. are for semi-professionals and creative people, not for gamers; nobody needs 24 GB of VRAM in gaming. But it's nice for mixed use and a lot cheaper than the Quadro series.

The only explanation for me: gamers love to have the "best" card; naming/PR is everything. AMD isn't much better, nor much cheaper, so there is room for a good margin. nVidia will continue to maximize revenue as long as AMD is behind in RT/tensor core technology. Customers who don't care about RT or DLSS are just fine with AMD. No problem at all.
 