Monday, October 23rd 2023

NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

NVIDIA's upcoming mid-life refresh for its GeForce RTX 40-series "Ada" product stack sees the introduction of three new SKUs, led by the GeForce RTX 4080 SUPER, as was reported last week. In that earlier report, we speculated on how NVIDIA could go about creating the RTX 4080 SUPER. BenchLife reports that the RTX 4080 SUPER will come with 20 GB as its standard memory size, and will be based on the larger "AD102" silicon. The SKU will utilize a 320-bit wide memory interface carved out of the 384-bit bus available to the silicon. The "AD102" die has 144 streaming multiprocessors (SM), of which the flagship RTX 4090 enables 128, so NVIDIA could pick an SM count that's lower than that of the RTX 4090 while being higher than the 76 of the current RTX 4080.
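For a sense of where the 20 GB figure comes from, here is a minimal sketch of the bus-width arithmetic, assuming one 2 GB (16 Gbit) GDDR6X package per 32-bit channel as on the rest of the RTX 40-series lineup; the report itself does not spell this out.

```python
# Back-of-the-envelope memory sizing for the rumoured configuration.
# Assumption (not confirmed by the report): one 2 GB (16 Gbit) GDDR6X
# package per 32-bit channel, as used across the rest of the RTX 40 series.

BITS_PER_CHANNEL = 32   # width of one GDDR6X memory channel
GB_PER_PACKAGE = 2      # capacity of one 16 Gbit package

def vram_gb(bus_width_bits: int) -> int:
    """Total VRAM in GB for a given memory-bus width."""
    return (bus_width_bits // BITS_PER_CHANNEL) * GB_PER_PACKAGE

print(vram_gb(384))  # full AD102 bus       -> 24 (RTX 4090)
print(vram_gb(320))  # rumoured 4080 SUPER  -> 20
print(vram_gb(256))  # current RTX 4080     -> 16
```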
Sources: Wccftech, BenchLife.info

145 Comments on NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

#76
fevgatos
AusWolf: I'm not comparing greediness. I just offered an explanation of the current market conditions. Like I said, Nvidia is often times 25% more expensive than the equivalent Radeon, but people still buy it because it's green.


That's true up to RDNA 2. Both 6000 and 7000 series cards are good products at reasonable prices and with wide availability. The majority still wants Nvidia because "DLSS is soooo much better at making my game look like crap, muh! Hardware Unboxed said so with their pixel-by-pixel analysis that no gamer in their right mind ever does at home, but still!"
It was definitely not true for RDNA 2, at least not in Europe. And I'm talking about big retailers in the EU like Mindfactory, Alternate, etc. You couldn't even find LISTINGS in EU shops for RDNA 2 months after their launch. The first cards in stock started showing up around February of 2021, and that's of course when mining took off.

And just because you don't like DLSS for whatever reason, even though it's straight up better than native, why do you expect other people to feel the same?
Posted on Reply
#77
dj-electric
lexluthermiester: BTW, who called it, eh?
Everyone who witnessed the RTX 40-series cards' initial launch and saw that many of them are made of slightly cut-down core complexes of their own full-core configs.
This road has been paved from the very start, really. Competition-less, NVIDIA is used to making 2 product generations out of 1 architecture.
Posted on Reply
#78
AusWolf
fevgatos: It was definitely not true for RDNA 2, at least not in Europe. And I'm talking about big retailers in the EU like Mindfactory, Alternate, etc. You couldn't even find LISTINGS in EU shops for RDNA 2 months after their launch. The first cards in stock started showing up around February of 2021, and that's of course when mining took off.
Okay, fair enough. I'll give you that one. It doesn't explain current conditions, though.
fevgatos: it's straight up better than native
Riiiight... :roll:
Posted on Reply
#79
the54thvoid
Intoxicated Moderator
fevgatos: much higher RT performance
That's actually a fallacy I called out in the reviews for the 4xxx. The RT performance hit on ADA is practically the same as Ampere. Only the 4090 appears to get clear. You need to look at each game where the results vary, but the drop in performance on a 3080 using RT is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same transferring to ADA from Ampere. RT fps only improves because the base performance (rasterisation) increases, but the actual hit is the same.
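As a rough illustration of that point (hypothetical numbers, not benchmark results): if the relative RT hit stays the same across generations, RT frame rates rise only because raster frame rates do.

```python
# Hypothetical illustration: if the relative RT cost is unchanged between
# generations, RT fps scales only with raster fps. The numbers below are
# made up for illustration, not measured results.

def rt_fps(raster_fps: float, rt_hit: float) -> float:
    """Frame rate with ray tracing enabled, given a fractional performance hit."""
    return raster_fps * (1.0 - rt_hit)

RT_HIT = 0.50  # ~50% drop, roughly what the post above describes for 3080/4080

for name, raster in [("Ampere-class card", 100.0), ("Ada-class card", 150.0)]:
    print(f"{name}: raster {raster:.0f} fps -> RT {rt_fps(raster, RT_HIT):.0f} fps")
# Both lose the same 50%; the newer card is faster with RT enabled only
# because its raster baseline is higher.
```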
Posted on Reply
#80
fevgatos
the54thvoid: That's actually a fallacy I called out in the reviews for the 4xxx. The RT performance hit on ADA is practically the same as Ampere. Only the 4090 appears to get clear. You need to look at each game where the results vary, but the drop in performance on a 3080 using RT is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same transferring to ADA from Ampere. RT fps only improves because the base performance (rasterisation) increases, but the actual hit is the same.
What difference does that make? I don't care about the % drop in performance, I care about the framerate in games worth playing with RT. And in those, sadly, AMD cards are just 1 or 2 generations behind. Someone might not care about RT and that's fine, but those are the facts right now.
Posted on Reply
#81
AusWolf
the54thvoid: That's actually a fallacy I called out in the reviews for the 4xxx. The RT performance hit on ADA is practically the same as Ampere. Only the 4090 appears to get clear. You need to look at each game where the results vary, but the drop in performance on a 3080 using RT is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same transferring to ADA from Ampere. RT fps only improves because the base performance (rasterisation) increases, but the actual hit is the same.
That's why I call Ada Ampere 2.0 and Ampere Turing 2.0. There has been zero change in the base architecture since 2018, and I fail to see what makes the new generation of RT cores actually new.
Posted on Reply
#82
fevgatos
AusWolf: Okay, fair enough. I'll give you that one. It doesn't explain current conditions, though.


Riiiight... :roll:
Well, see, just because you disagree doesn't make it true. Of course that works both ways. That's why I'm willing to offer you a blind test: I'll post a game at native 1440p and then the same at 4K with DLSS Q. Same internal resolution, same performance, and the DLSS one will look so much better you'd throw native into the trash where it belongs.
Posted on Reply
#83
bug
fevgatos: Call me silly, but in my mind, the company that has 10-15-20% market share is the one that should be trying to compete - lower prices. Not the one that has 80%. When nvidia knows (and they know, cause it has happened for the last 6-7 years now) that whatever they release, at whatever price, people will buy it BECAUSE AMD will offer an inferior product for a similar amount of money (okay, maybe with a 50€ discount), what is the incentive for nvidia to lower its prices? This gen was extra fascinating, cause some nvidia cards, on top of having more and better features, lower power draw and much higher RT performance, even had better raster performance per dollar, which is absolutely absurd. Check the launch prices of the 4070ti vs the 7900xt: the 4070ti had higher raster per $. But people feel the need to blame nvidia even when amd is doing even worse in some cases.
No, no, no. You got it all wrong.

You see, Nvidia is the bad guy here (that's axiomatic, doesn't need a demonstration). AMD, doing the exact same things as Nvidia, is only forced to do so (again, axiomatic), thus, while doing the exact same things, they are obviously the good guys.

Now, go write that down 100 times so it sticks to your brain so you won't make further silly comments on the Internet, ok? ;)
Posted on Reply
#84
stimpy88
the54thvoid: That's actually a fallacy I called out in the reviews for the 4xxx. The RT performance hit on ADA is practically the same as Ampere. Only the 4090 appears to get clear. You need to look at each game where the results vary, but the drop in performance on a 3080 using RT is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same transferring to ADA from Ampere. RT fps only improves because the base performance (rasterisation) increases, but the actual hit is the same.
NOBODY that I'm aware of has actually gone in-depth and fully compared RT performance clock for clock on different nVidia GPUs. It seems funny to me when nVidia bangs on about a 3x RT performance boost, yet the penalty for enabling RT is always around the same. If the RT cores were really 3 times faster than the previous generation, then surely in the same game, at the same settings, enabling RT would have less of a performance impact; the IQ certainly hasn't improved, so nothing extra is being done, yet the performance penalty is virtually the same.

I have always smelled shenanigans when nVidia markets its RT performance, ever since the 30x0 series came out. It's also made more perplexing by the fact that a Radeon GPU with no or limited HW RT performs as well as it does compared to nVidia's offerings with their "state of the art RT cores"... I never trust nVidia marketing, and the whole HW RT core thing makes me suspicious. It makes me think that HW RT is more like SW RT with some HW assist.

EDIT:
FYI - I was just looking into nVidia's performance claims for their RT cores. The 30x0 series is listed as having 2x the RT performance of the 20x0 series. The 40x0 series is listed as having 2x the RT performance of the 30x0 series...
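A simple Amdahl's-law style sketch of the reasoning above: if only the RT portion of a frame gets k times faster, the overall penalty for enabling RT should shrink. All numbers here are hypothetical, not measured.

```python
# Sketch of why a genuinely faster RT unit should reduce the RT penalty.
# Frame time = raster work + RT work; only the RT part speeds up by k.
# The split below is an assumption chosen for illustration.

def rt_penalty(raster_ms: float, rt_ms: float, rt_speedup: float) -> float:
    """Fractional fps drop from enabling RT, given frame-time components."""
    total_ms = raster_ms + rt_ms / rt_speedup
    return 1.0 - raster_ms / total_ms

RASTER_MS, RT_MS = 10.0, 10.0  # assume RT work doubles frame time on the old gen

for k in (1, 2, 3):
    print(f"RT cores {k}x faster -> penalty {rt_penalty(RASTER_MS, RT_MS, k):.0%}")
# 1x -> 50%, 2x -> 33%, 3x -> 25%. A roughly constant ~50% penalty across
# generations suggests the RT portion of the frame is not shrinking much.
```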
Posted on Reply
#85
AusWolf
fevgatos: Well, see, just because you disagree doesn't make it true. Of course that works both ways. That's why I'm willing to offer you a blind test: I'll post a game at native 1440p and then the same at 4K with DLSS Q. Same internal resolution, same performance, and the DLSS one will look so much better you'd throw native into the trash where it belongs.
No. Post a screenshot at 4K native vs 4K DLSS Q. Or post it at 1440p native vs 1440p DLSS Q. Nobody with a 4K monitor plays at 1440p ever.

Edit: I have to add, I had or have no intention to turn the conversation into an AMD vs Nvidia comparison in any way. You did that yourself, like you always do for some unknown reason.
Posted on Reply
#86
stimpy88
AusWolf: No. Post a screenshot at 4K native vs 4K DLSS Q. Or post it at 1440p native vs 1440p DLSS Q. Nobody with a 4K monitor plays at 1440p ever.

Edit: I have to add, I had or have no intention to turn the conversation into an AMD vs Nvidia comparison in any way. You did that yourself, like you always do for some unknown reason.
If you have a small monitor or poor eyesight, then some will view DLSS as "free performance"...

I certainly do not see DLSS as anything other than a tool to increase performance because your card lacks performance, at the expense of image quality.

DLSS is unusable for me, as it looks like crud on my 50" 4K screen, and while it's fine and arguably a great feature for lower-end cards, I see it as unacceptable that this is now starting to be mandated on cards costing well over a thousand dollars. DLSS has turned into a performance crutch that game devs are now exploiting, with nVidia's blessing, after realising that they can shift more low-end product for a higher price and higher profits.
Posted on Reply
#87
fevgatos
AusWolf: No. Post a screenshot at 4K native vs 4K DLSS Q. Or post it at 1440p native vs 1440p DLSS Q. Nobody with a 4K monitor plays at 1440p ever.

Edit: I have to add, I had or have no intention to turn the conversation into an AMD vs Nvidia comparison in any way. You did that yourself, like you always do for some unknown reason.
Why would I post a screenshot where one is running at 100 fps and the other one at 140? That's not a fair comparison cause a static image doesn't take framerate into account. A proper comparison is done by testing at same framerate, and at same framerate dlss absolutely smashes native rendering.
Posted on Reply
#88
AusWolf
fevgatos: Why would I post a screenshot where one is running at 100 fps and the other one at 140? That's not a fair comparison cause a static image doesn't take framerate into account. A proper comparison is done by testing at same framerate, and at same framerate dlss absolutely smashes native rendering.
You said "it's straight up better than native" which is not the same as "it runs faster because it looks worse".
Posted on Reply
#89
stimpy88
fevgatos: and at same framerate dlss absolutely smashes native rendering.
Are you saying what I think you're saying here? :eek:
Posted on Reply
#90
fevgatos
AusWolf: You said "it's straight up better than native" which is not the same as "it runs faster because it looks worse".
It is straight up better than native when you compare properly, i.e. at iso framerate.

It's also better than native in most games even when you compare them the way you do.
Posted on Reply
#91
AusWolf
fevgatos: It is straight up better than native when you compare properly, i.e. at iso framerate.
It seems our definitions of the word "properly" are entirely different. I'll leave it at that.
fevgatos: It's also better than native in most games even when you compare them the way you do.
Erm, no.
Posted on Reply
#92
fevgatos
AusWolf: It seems our definitions of the word "properly" are entirely different. I'll leave it at that.


Erm, no.
Well, you can say no and I can say yes; that's why blind tests exist.

The whole point of dlss is that you can get better image quality with similar performance to native. So in order to properly test if that is the case, you have to equalize framerate. It's not really something to argue about. It is what it is.
Posted on Reply
#93
AusWolf
fevgatos: Well, you can say no and I can say yes; that's why blind tests exist.
Which I've seen and done a few of and then drawn my conclusion.
fevgatos: The whole point of dlss is that you can get better image quality with similar performance to native. So in order to properly test if that is the case, you have to equalize framerate. It's not really something to argue about. It is what it is.
No. The point of DLSS is that you can make your game run faster with a slight loss in image quality. With a 4K monitor, you will never ever in your whole life play any game at 1440p, so that comparison is utterly and entirely pointless.
Posted on Reply
#94
bug
fevgatos: Well, you can say no and I can say yes; that's why blind tests exist.

The whole point of dlss is that you can get better image quality with similar performance to native. So in order to properly test if that is the case, you have to equalize framerate. It's not really something to argue about. It is what it is.
I think there are two issues here.
1. Straight up image quality. If you look at still images, frame rates are rather irrelevant and DLSS may or may not look better than native, depending on its training.
2. Gameplay. This is where DLSS will falter more, when things get into motion. Frame rates definitely matter here. But keep in mind artifacting and ghosting in motion can happen even in the absence of DLSS.

Imho, this is all largely irrelevant. Why? Because there are two types of games: fast-paced and non fast-paced. Fast-paced games can exhibit the most problems, but at the same time you are more unlikely to spot them in the heat of the action. Unless there's annoying flickering or smth like that. In that case turn off DLSS, lower details, no way around that. For games that aren't so fast-paced, you can get by with 60fps or even less, so turn off DLSS if you pixel-peep.
Posted on Reply
#95
fevgatos
AusWolf: Which I've seen and done a few of and then drawn my conclusion.


No. The point of DLSS is that you can make your game run faster with a slight loss in image quality. With a 4K monitor, you will never ever in your whole life play any game at 1440p, so that comparison is utterly and entirely pointless.
The point isn't whether you are going to play at 1440p on a 4K monitor. The point is you can supersample and then use DLSS. Say you have a 1440p monitor: you use DLDSR to play at 4K and then upscale with DLSS. The render resolution is still 1440p, which is your monitor's native resolution, but the image quality is way above native.
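A rough pixel-count sketch of the combination just described, assuming DLSS Quality renders at roughly 2/3 of the output resolution per axis (the commonly cited scale; treat the exact factor as an assumption):

```python
# Pixel-count sketch of the DLDSR + DLSS Quality combination on a 1440p display.
# Assumption: DLSS "Quality" renders at ~2/3 of the output resolution per axis.

NATIVE = (2560, 1440)        # the monitor's native resolution
DLDSR_TARGET = (3840, 2160)  # DLDSR presents a virtual 4K target to the game
DLSS_QUALITY_SCALE = 2 / 3   # assumed per-axis render scale for DLSS Quality

render = tuple(round(d * DLSS_QUALITY_SCALE) for d in DLDSR_TARGET)
print("internal render:", render)        # (2560, 1440) -- same pixel count as native
print("reconstructed output:", DLDSR_TARGET)  # 4K image, then downsampled to the display
# The GPU shades roughly the same number of pixels as plain native 1440p,
# but the displayed image goes through a 4K reconstruction pass first.
```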
bug: I think there are two issues here.
1. Straight up image quality. If you look at still images, frame rates are rather irrelevant and DLSS may or may not look better than native, depending on its training.
2. Gameplay. This is where DLSS will falter more, when things get into motion. Frame rates definitely matter here. But keep in mind artifacting and ghosting in motion can happen even in the absence of DLSS.

Imho, this is all largely irrelevant. Why? Because there are two types of games: fast-paced and non fast-paced. Fast-paced games can exhibit the most problems, but at the same time you are more unlikely to spot them in the heat of the action. Unless there's annoying flickering or smth like that. In that case turn off DLSS, lower details, no way around that. For games that aren't so fast-paced, you can get by with 60fps or even less, so turn off DLSS if you pixel-peep.
The issue here is that, as you mentioned yourself, there are games that falter in motion at native. E.g. TLOU has flickering at native, and so does Starfield. People act like native is the end-all, be-all when in fact in most cases DLSS looks straight up better even in motion while increasing your framerate.
Posted on Reply
#96
Dimitriman
theouto: Most likely price hikes, don't be surprised if the 4090 surges in price
Well, I'm not gonna buy it. If others enable Nvidia, what can we do?
Posted on Reply
#97
stimpy88
fevgatos: The whole point of dlss is that you can get better image quality with similar performance to native.
Upscaled 1080p/1440p to 4K is BETTER quality than native 4K rendering.

Wow, ok, that's one hell of a statement right there! I need some time for your statement to sink in...
Posted on Reply
#98
bug
fevgatos: The issue here is that, as you mentioned yourself, there are games that falter in motion at native. E.g. TLOU has flickering at native, and so does Starfield. People act like native is the end-all, be-all when in fact in most cases DLSS looks straight up better even in motion while increasing your framerate.
At the end of the day, DLSS is just another tool in the GPU's toolbox. Use it if you will, or don't. Just don't tell me having one more tool is a bad thing. That's the attitude (not yours) that I don't get.
Posted on Reply
#99
fevgatos
stimpy88: Upscaled 1080p/1440p to 4K is BETTER quality than native 4K rendering.

Wow, ok, that's one hell of a statement right there! I need some time for your statement to sink in...
I can post you 2 screenshots running at a similar framerate and you tell me which is the native one. Deal?

Also that's not what I said at all, read again. I said 4k dlss q looks better than native 1440p while it performs similarly.
Posted on Reply
#100
stimpy88
fevgatosI can post you 2 screenshots running at similar framerate and you tell me which is the native on. Deal?

Also that's not what I said at all, read again. I said 4k dlss q looks better than native 1440p while it performs similarly.
1.) I can run it on my own system, and have.
2.) I quoted you.
Posted on Reply