
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

No mention of TDP, and the prices are a slap in the face. I don't care about omniverses, either. What a disappointment!

Portal RTX looks cool, though.
 
For me, too. I always look for the next Radeon generation.
The question is whether AMD cares about its customers or not. I mean, we don't want an RX 7700 XT for $879, lol.
It's going to be interesting to see the pricing of RDNA3 in November. Brace for some surprises, as there is a flood of GPUs on secondary markets now.
The 7700 XT is expected to perform similarly to or better than the 3090, which sits ~69% higher in relative raster performance. If the RDNA3 architecture brings this kind of uplift, which it should, then AMD needs to convince people to buy a 7700 XT instead of a used 3090 or a new 4070.

If you can buy a 3090 for less than $700 on eBay, AMD will need to offer a convincing price for the 7700 XT to attract buyers. It cannot cost more than a used 3090, so I expect $600-650 max. But that is far away, in H1 2023; things can change a bit.
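The buyer's math above can be sketched in a few lines. All numbers are illustrative assumptions (a 100-point baseline card, the prices mentioned in the post), not benchmark results:

```python
# Sketch of the value comparison argued above. Numbers are
# illustrative placeholders from the post, not benchmarks.
def perf_per_dollar(relative_perf, price_usd):
    """Relative-performance points bought per dollar spent."""
    return relative_perf / price_usd

used_3090 = perf_per_dollar(169.0, 700.0)   # ~69% over a 100-point baseline
new_7700xt = perf_per_dollar(169.0, 650.0)  # same perf tier at a $650 price

print(f"used 3090: {used_3090:.3f} points/$")
print(f"7700 XT:   {new_7700xt:.3f} points/$")
# At equal performance, the 7700 XT only wins on value if it undercuts
# the used-3090 street price - hence the $600-650 ceiling argued above.
```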
 

The 4080s look pretty bad in games without ray-tracing. The 3090 Ti is only 25% faster than the 3080, and the "4080-12" is actually slower here. It must be the terrible memory bandwidth, considering how high GPU clocks are.

I will hold off until a game with some insane RT visuals comes out (that I actually want to play). My 3080 is more than enough for rasterization. Maybe they will drop the prices next year. No way they can keep this up after the 30 series is gone.
 
---------------------------------------------------
Just look at the size of the Aorus, it's just ridiculous!

3090 Strix vs 4090 Strix:
 
The 4080s look pretty bad in games without ray-tracing. The 3090 Ti is only 25% faster than the 3080, and the "4080-12" is actually slower here. It must be the terrible memory bandwidth, considering how high GPU clocks are.

I will hold off until a game with some insane RT visuals comes out (that I actually want to play). My 3080 is more than enough for rasterization. Maybe they will drop the prices next year. No way they can keep this up after the 30 series is gone.

Yeah, it just shows that it is really a 4060 Ti, because the memory setup is very similar to the 3060 Ti's.
And hey, what is a "4080-12" with a 192-bit bus? What's next, a 4060 with a 64-bit bus? lol
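The bus-width complaint above is easy to quantify: peak memory bandwidth is just the per-pin data rate times the bus width. The spec figures below are from memory and could be slightly off, so treat them as illustrative:

```python
# Peak memory bandwidth = effective data rate (Gbps per pin) x bus width / 8.
# Spec numbers below are recalled from memory; treat as illustrative.
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RTX 3060 Ti (14 Gbps, 256-bit)": bandwidth_gbs(14, 256),  # 448 GB/s
    "RTX 3080    (19 Gbps, 320-bit)": bandwidth_gbs(19, 320),  # 760 GB/s
    '"4080 12GB" (21 Gbps, 192-bit)': bandwidth_gbs(21, 192),  # 504 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
# Despite faster GDDR6X, the narrow 192-bit bus leaves the "4080 12GB"
# far below the 3080 in raw bandwidth - closer to 3060 Ti territory.
```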
 
I don't think it's going to be that interesting; my guess is that the 7900 XT will undercut Nvidia by $100, release at a $1,499 MSRP, and call it a day.

It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle Nvidia completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
 
It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle Nvidia completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.

NVIDIA has something like 80% of the discrete desktop graphics card market. I doubt that AMD can defeat NVIDIA with one card with sharp pricing.

At the most, AMD has a chance at nibbling away at NVIDIA's market share.

Let's not forget that the NVIDIA GPU chip has excellent adoption as a compute engine (AI, etc.) and that NVIDIA's Gaming business is now smaller than their Data Center business.
 
NVIDIA has something like 80% of the discrete desktop graphics card market. I doubt that AMD can defeat NVIDIA with one card with sharp pricing.

At the most, AMD has a chance at nibbling away at NVIDIA's market share.

Let's not forget that the NVIDIA GPU chip has excellent adoption as a compute engine (AI, etc.) and that NVIDIA's Gaming business is now smaller than their Data Center business.
AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.

As for that last point, it has nothing to do with GeForce sales or desktop GPU market share.
 
The lack of the new DLSS 3.0 across the entire existing RTX lineup is a spit in the face of players. I could understand if a given series, like RTX 2000 or RTX 3000, simply ran it 10-30% slower depending on how strong the card is. Now it looks as if everything was planned from the beginning. The cryptocurrency boom is over, and suddenly we have a technology available only on the latest series? After all, this is not a technological change like GTX vs RTX. So perhaps it is a pre-planned process: on one hand a sudden, huge leap, and on the other no (full) support within the same RTX family?! It's like showing the middle finger to players.
We have to wait for reliable tests of the technology, but it looks like the current duopoly, and cryptocurrencies before it, have turned out to be deadly for players while handing huge, disproportionate profits to Nvidia and AMD. Gaming on PC is becoming an abstraction.

If this is confirmed, I don't fully understand game producers and publishers either. They limit their potential number of customers with questionable technology. Currently, RT mostly looks like an on/off switch for lighting, shadows, etc., yet many titles without RT looked great. A real technological leap would make sense, but this looks like a plain lack of support. It's as if you bought a new iPhone and after a year your iPhone 12 or 13 were no longer usable for basic functions (a performance drop of several dozen percent); with one update you lose the usefulness of your phone within a year or two, and you have to switch to the new product right after its premiere because the life cycle is extremely short.

Someone will say: great, we have a huge performance increase. And I will say: the RTX 2000 series gave a very small increase in performance in games without RT. There were only a few RT games, and their performance was mediocre; at the same price over the previous series, the gain was barely 7-10%, whereas the GTX 1070-1080 Ti, even when they cost more, delivered much greater performance gains.
For example:
https://www.techpowerup.com/review/nvidia-geforce-gtx-1070/24.html what we got for $379 ;)
Even in the next series, the fastest card, the RTX 3090 Ti, costing $2,000, only managed 25-50 FPS at 4K with RT in some games. For $2,000! If the leap in performance this time is so large only with DLSS 3.0, it would suggest that Nvidia, flush with cryptocurrency profits, didn't have to worry about what players think. One market (cryptocurrencies) is over; now, if you want to play, everyone "has" to buy the new series.
 
Reviews are gonna be tricky. No DLSS vs no DLSS, DLSS2 vs DLSS2, & DLSS2 vs DLSS3.

Graphics card reviewers have already been dealing with this for a while.

The better reviewers typically compare a number of games on pure 3D rasterization. They then add some comparisons with RT on but no enhancements like DLSS or FSR. Finally, there are some comparisons between DLSS and FSR, particularly in the handful of titles that support both. With each passing week, more gaming titles support these two technologies.

And we will soon see the addition of XeSS to the mix.

It's not like hardware reviewers are being blindsided by DLSS 3.0.

For the most part, today's graphics cards have enough 3D rasterization performance on a single PCB which is why NVLink wasn't even included on the Ada cards. Super sampling technologies like DLSS, FSR, and XeSS are most beneficial when ray tracing is enabled.
 
No mention of TDP, and the prices are a slap in the face. I don't care about omniverses, either. What a disappointment!
Full specs are on their site - but it's rather telling that you have to go looking for them, and look pretty hard too: you have to click and scroll through several meaningless feature lists to get to any actual specs, making it seem like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't be pushing this until 30-series supplies are much lower). Here's everything relevant that they've posted:
[attached images: spec tables from Nvidia's site]

So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. FE coolers are huge chunguses, except for the 12GB, which... doesn't have one? Uh, okay? The 4080 12GB should be very, very noticeably slower than the 16GB.
 
AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.

As for that last point, it has nothing to do with GeForce sales or desktop GPU market share.

To my knowledge, AMD Radeon has historically been less expensive than comparable GeForce products, at least over the past five years. From a performance-per-dollar metric on 3D rasterization, Radeon is the better buy at 1080p and 1440p. However, AMD's position weakens at 4K and 8K, as well as with other features like RT and DLSS/FSR.

With more gaming titles adopting RT and DLSS/FSR (and now XeSS), the performance comparison now takes multiple charts, tables, and graphs. The old paradigm of "run this 3D benchmark utility to compare scores" from five years ago is increasingly obsolete.

And cherry picking one or two games (MS Flight Simulator, Cyberpunk 2077, Red Dead Redemption, whatever) isn't particularly useful. The better analyses use a battery of games to average out architectural advantages and disadvantages of each manufacturer, whether it be DX11 vs. DX12 or other newer features like super sampling.
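The "battery of games" point can be made concrete: reviewers commonly summarize many per-game ratios with a geometric mean, which keeps one outlier title from dominating the average. The FPS figures here are invented purely for illustration:

```python
# Why averaging over many games matters: per-title ratios are noisy, and
# the geometric mean is the usual way to combine them (a 2x win and a
# 0.5x loss cancel out exactly). All FPS numbers below are made up.
from math import prod

card_a_fps = [120, 88, 145, 60, 97]   # hypothetical per-game results
card_b_fps = [100, 95, 110, 72, 90]

ratios = [a / b for a, b in zip(card_a_fps, card_b_fps)]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"card A is {geomean:.2f}x card B on average")
# One cherry-picked title (e.g. the 145 vs 110 game) would suggest a
# much bigger lead than the whole battery supports.
```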

I don't know about anyone else here but I happen to play some older titles on newer hardware, including games that aren't always part of a graphics card review.
 
I don't think it's going to be that interesting; my guess is that the 7900 XT will undercut Nvidia by $100, release at a $1,499 MSRP, and call it a day.
The highest market tier is not going to be interesting for most people, so there is no excitement in 7900 XT vs 4090. You are right.
More interesting will be the upper mid-range and mid-range. Don't forget that AMD will also offer DP 2.0 ports, a PCIe 5.0 x16 and x8 interface, and more VRAM, making for more future-proof products.

It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle Nvidia completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
It seems to me that it's the 7800 that will try to be competitive with the 4080 12GB, the 7800 XT with the 4080 16GB, and the 7700 XT with the 4070.
 
Those prices are garbage
I feel like Nvidia might be losing the plot here. We on TPU (according to the survey) definitely fit into a minority niche of hardware enthusiasts. Something approaching 99% of the population will ask themselves, "Should I buy a graphics card for my PC, or should I buy a PS5/Xbox?" When the GPU costs so much more than a decent console and comes with the additional burden of needing a new PSU as well, it's not really relevant to the vast majority.

If, when the sub-$500 cards are launched, they can be run on a reasonable PSU that someone with a last-gen PC might have (so probably a 650-850W PSU), then we can tell for sure if Nvidia have lost the plot or not.
 
My favourite part is where they decided to call the 4070 the 4080 12GB instead. That will sell more units. Watch the 4060 come out as the 4080 8GB.
 
Full specs are on their site
The specs are kinda useless since Nvidia have once again redefined what units in the architecture count as "one CUDA core". Per CUDA core, Turing was vastly better than Ampere, but the numbers went up regardless. Until we get a full independent review of Lovelace vs Ampere, the core and clock numbers are meaningless because we don't know how potent each core is in relation to the current gen.
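The spec-sheet problem above can be seen in the paper math. Theoretical FP32 throughput is cores x 2 FLOPs per cycle x clock, but what one "CUDA core" sustains in real games changed between architectures, so the headline number misleads. Clock and core figures below are approximate, recalled from memory:

```python
# Why raw core counts mislead: paper FP32 rate is
#   cores x 2 FLOPs/cycle x boost clock,
# but Ampere redefined "one CUDA core" (its second FP32 datapath shares
# resources with INT32), so paper gains far exceed real-game gains.
# Core counts and clocks below are approximate.
def fp32_tflops(cores, boost_ghz):
    """Theoretical peak FP32 throughput in TFLOPS."""
    return cores * 2 * boost_ghz / 1000

print(f"RTX 2080 Ti (Turing): {fp32_tflops(4352, 1.545):.1f} TFLOPS")
print(f"RTX 3080 (Ampere):    {fp32_tflops(8704, 1.710):.1f} TFLOPS")
# Paper throughput roughly doubled, yet real-game gains were far
# smaller - exactly why independent reviews beat spec sheets.
```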

My favourite part is where they decided to call the 4070 the 4080 12GB instead. That will sell more units. Watch the 4060 come out as the 4080 8GB.
I'm in the queue for my low-profile, single-slot 4080 4GB DDR4 :D
 
AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.
Not just that, but RDNA3 cards will offer PCIe 5.0 connectivity, DP 2.0 and more VRAM. It should be an attractive package. We shall see.
 
Full specs are on their site - but it's rather telling that you have to go looking for them, and look pretty hard too: you have to click and scroll through several meaningless feature lists to get to any actual specs, making it seem like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't be pushing this until 30-series supplies are much lower). Here's everything relevant that they've posted:
So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. FE coolers are huge chunguses, except for the 12GB, which... doesn't have one? Uh, okay? The 4080 12GB should be very, very noticeably slower than the 16GB.
I expect around -18% at QHD and a little more at 4K.
The reference 3090 with a 350 W TBP needed two PCIe 8-pins, and the 4080 with 320 W needs three?
 
That's what VAT does to the numbers when you compare to US MSRPs, which are always quoted without tax. Germany even has it pretty good at 19%; in Norway and Sweden it's 25%.

Still, that's like $1,730 USD vs €1,950... At least the 4080 16GB is getting the same cooler as the 4090 and not the super-gimped version they used on everything below the 3090 last gen.
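The VAT effect is easy to check by stripping German VAT back out of a euro list price. The €1,950 figure is from the post; the near-parity EUR/USD exchange rate is my assumption (the two currencies were roughly at parity in autumn 2022):

```python
# Rough check of the US-vs-EU price gap: remove German VAT from the
# quoted euro price, then convert. The exchange rate is an assumption
# (EUR and USD were near parity in autumn 2022).
eur_list = 1950.0    # quoted euro price, VAT included (from the post)
vat_de = 0.19        # German VAT rate
usd_per_eur = 1.00   # assumed exchange rate

pre_tax_eur = eur_list / (1 + vat_de)
pre_tax_usd = pre_tax_eur * usd_per_eur
print(f"pre-tax equivalent: ~${pre_tax_usd:.0f}")
# Most of the apparent EU markup is tax; the rest is exchange rate
# and retailer margin.
```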
 
Full specs are on their site - but it's rather telling that you have to go looking for them, and look pretty hard too: you have to click and scroll through several meaningless feature lists to get to any actual specs, making it seem like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't be pushing this until 30-series supplies are much lower). Here's everything relevant that they've posted:
So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. FE coolers are huge chunguses, except for the 12GB, which... doesn't have one? Uh, okay? The 4080 12GB should be very, very noticeably slower than the 16GB.
I guess there's no 4080 12GB FE, only AIB cards. It really should have been called the 4070. I also guess it would have looked really bad to release a 285 W x70-series card, and that's why there are two vastly different 4080s. I'm curious what the x60 and x50 tiers will look like, and how the 4070 will be positioned if there's gonna be one.

I still hold my position - not impressed.
 