Monday, October 23rd 2023

NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

NVIDIA's upcoming mid-life refresh for its GeForce RTX 40-series "Ada" product stack sees the introduction of three new SKUs, led by the GeForce RTX 4080 SUPER, as was reported last week. In that report, we speculated on how NVIDIA could go about creating the RTX 4080 SUPER. BenchLife reports that the RTX 4080 SUPER will ship with 20 GB of memory as standard, and will be based on the larger "AD102" silicon. The SKU will use a 320-bit wide memory interface carved out of the 384-bit bus available to the silicon. The "AD102" die has 144 streaming multiprocessors (SM), of which the flagship RTX 4090 enables 128, so NVIDIA could pick an SM count lower than that of the RTX 4090 but higher than the 76 of the current RTX 4080.
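Purely as illustration, here is a back-of-the-envelope sketch in Python of what those figures imply, assuming the usual configuration of one 2 GB GDDR6X module per 32-bit memory channel; the SM counts in the loop are hypothetical picks for illustration, not leaked figures.

```python
# Rough sanity check of the rumored RTX 4080 SUPER configuration.
# Assumes one 2 GB GDDR6X module per 32-bit channel, as on other Ada SKUs.

def memory_capacity_gb(bus_width_bits: int, module_gb: int = 2) -> int:
    """Memory capacity implied by a bus width, one module per 32-bit channel."""
    channels = bus_width_bits // 32
    return channels * module_gb

print(memory_capacity_gb(320))  # 10 channels x 2 GB = 20 GB, matching the rumor
print(memory_capacity_gb(384))  # the full AD102 bus would allow 24 GB (RTX 4090)

# Any SM count between the RTX 4080's 76 and the RTX 4090's 128 is plausible.
# Ada uses 128 CUDA cores per SM, so a few hypothetical configurations:
for sm in (96, 112, 120):
    print(f"{sm} SMs -> {sm * 128} CUDA cores")
```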
Sources: Wccftech, BenchLife.info

145 Comments on NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

#126
Random_User
theouto1440p dlss quality is not 1080p btw, it's closer to 900p

Also, I am astonished to report that they all look meh. I am a bit surprised, actually; it's higher res, sure, but it retains many weaknesses of the native 1080p image, such as this, which looks to be AO stair-stepping
(see what I said about it not making things look as much better as you may think?) (4K DLSS Perf on the left, 1080p on the right)

(1440p DLSS Q on the left, native 1080p in the middle, 4K DLSS Perf on the right.) I actually find this particle effect looks better at native 1080p than at 1440p DLSS Quality because of what I mentioned before: it looks noticeably more blocky with DLSS. It's also why the transparent window looks arguably nicer at native 1080p compared to DLSS at 1440p. Hell, I'd say it looks better than in the 4K DLSS Perf image too, because there it looks incredibly out of place: DLSS is trying to upscale an effect (the smudginess of the window) that depends on the internal resolution, yet it upscales it to 4K in the same way the particles or SSR would be, so it looks jarring by comparison.

I guess the lines themselves are sharper, sure, but texture quality does not feel quite crystal-clear 4K (which is a problem I had in many games back when I ran DLSS: textures never feeling quite right; whether it's because of incorrect use of negative bias or just DLSS, that's not my problem).
I'd also argue that Cyberpunk 2077 is not the best game to test this with, since it easily has one of the worst TAA solutions I've seen in any game; Spider-Man might be a better pick tbh, especially with SMAA.

Anyway, I think I'll be sticking to 1080p, thank you. Low res, but not jarring (though the TAA is terrible, so I'll just not look at this game).

This was pointless
The 4K DLSS Performance shot looks sharper. Dunno why, but to me it also looks like a photo that lost focus during the shot.

Sorry for my input. But...

There's no magic or wonder. It's impossible to make a better picture with less effort and lower requirements. One can't substitute inferior upscaling for the original, more demanding rendering, without additional workloads to "better" it afterwards, and still get the quality of native rendering.

Isn't the whole point of high-end video cards to show off the eye candy of graphics achievements and capabilities?

I can get it if it's used for the low end, but... it's ridiculous that newer generations of more powerful cards use upscaling in order to play current games.

Yes, when Crysis was released back in 2007 there was no DLSS or FSR, and many would have preferred those to stuttering. But the whole point of better graphics gets lost. Not to mention it takes extra silicon and power consumption to run at a lower resolution and then add a bunch of complicated stuff on top, just to see a picture inferior to native rendering.

One may argue that DLSS 3.5 looks better than native. But then again, it requires a ton of complicated machinery to run, for the price of more heat and significant latency. And upscaled rendering loses the ability to load the larger textures that would otherwise be used at the bigger native resolution. It also seems stupid to upscale each frame in real time instead of loading the textures once per game or level launch.

Not to mention that all this upscaling makes the work of game devs moot and makes them lazier, as there will be no point in putting in more effort.

I would rather play with better settings at a lower resolution than use upscaling.
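For context, here is a rough sketch of the internal render resolutions being argued over in this thread, assuming the commonly cited per-axis DLSS scale factors (about 0.667 for Quality, 0.5 for Performance) and DLDSR's 2.25x total factor; these factors are assumptions, not figures taken from the posts.

```python
# Internal render resolutions implied by the DLSS modes discussed in this
# thread, assuming the commonly cited per-axis scale factors.

DLSS_SCALE = {"Quality": 2 / 3, "Performance": 1 / 2}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution DLSS actually renders at before upscaling to the output."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(2560, 1440, "Quality"))      # (1707, 960): hence "closer to 900p"
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): same pixel load as native 1080p
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440): the DLDSR+DLSS combo,
# where a 1440p monitor is upsampled 2.25x by DLDSR to a 4K target and DLSS
# Quality then renders internally at the monitor's native 1440p.
```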
#127
AusWolf
fevgatosThe point isn't whether you are going to play at 1440p on a 4K monitor. The point is you can supersample and then use DLSS. Say you have a 1440p monitor: you use DLDSR to play at 4K and then upscale with DLSS. The render resolution is still 1440p, which is your monitor's native resolution, but the image quality is way above native.
You're deflecting the point. We were talking about DLSS, not DLDSR.
fevgatosI can post you two screenshots running at a similar framerate and you tell me which is the native one. Deal?

Also, that's not what I said at all; read again. I said 4K DLSS Q looks better than native 1440p while it performs similarly.
I called bullshit on that once, and if you insist, I will again. With a 1440p monitor, I will not adjust my resolution to target X frame rate. I will target 1440p, and adjust graphics settings to achieve a desirable frame rate.
#128
Dr. Dro
Random_UserNobody dies. It just hurts your personal feelings, since you have too much entanglement with Nvidia's corporate mentality. No offence. Nvidia won't cradle you before bed. :laugh:
BTW, as you seem to be an avid NV fan, have you ever seen the Nvidia logo? At least one glance? :rolleyes:

Not to rub your wounds, however...
Now, here goes the real punch :D
Unforgivable. Dell and Nvidia literally murder kittens. :(

But I'm far from an avid NV fan; I just had a bad lovers' breakup with AMD lol
#129
bug
fevgatosWell, here you go. Not a blind test since I have the names, but it doesn't need a blind test. In my eyes the 4K DLSS Performance is by far the best one, but performance drops a bit; 1440p DLSS Q and native 1080p have the same fps, but the DLSS is apparently better. It retains more detail on pretty much everything. Sadly, even imgbb compressed them a bit; the original images were 25 MB, imgbb hosts them at 10 MB.

i.ibb.co/swbznGb/4k-dlss-performance.png
i.ibb.co/M2RJqs6/1080p-native-1.png
i.ibb.co/JFztSLy/1440p-DLSS-Q-2.png
Ok, got to my PC to check these out on a proper monitor.
There isn't much detail that stays the same between those scenes, but the neon lights at least look clearly better in the image you labeled 4K DLSS Performance.
#130
lexluthermiester
dj-electricEveryone who witnessed the RTX 40 series cards' initial launch and saw that many of them are made of slightly cut-down core complexes of their own full-core configs.
This road has been paved from the very start, really. Competition-less, NVIDIA is used to making 2 product generations out of 1 architecture.
Exactly! This is not something people should be surprised about, nor whine and complain about.
#131
Prima.Vera
Isn't the RTX 50x0 gen supposed to launch next year?
#132
wNotyarD
Prima.VeraIsn't the RTX 50x0 gen supposed to launch next year?
Nope; pretty much all reports point to 2025 for Blackwell at the earliest.
#133
AusWolf
Dr. DroThe question that needs to be asked is why that is. Why is Ryzen, in the five years it has existed as a brand/product/series, considered positive, while Radeon, despite existing since 2000, isn't?
Definitely. All I'm saying is that the preconceived notion has had little to do with reality since the RX 6000 series, yet it still lingers in the average gamer's mind.
fevgatosWell, here you go. Not a blind test since I have the names, but it doesn't need a blind test. In my eyes the 4K DLSS Performance is by far the best one, but performance drops a bit; 1440p DLSS Q and native 1080p have the same fps, but the DLSS is apparently better. It retains more detail on pretty much everything. Sadly, even imgbb compressed them a bit; the original images were 25 MB, imgbb hosts them at 10 MB.

i.ibb.co/swbznGb/4k-dlss-performance.png
i.ibb.co/M2RJqs6/1080p-native-1.png
i.ibb.co/JFztSLy/1440p-DLSS-Q-2.png
Would you consider playing at 1080p on your 4K screen? Is that why you bought it, to play at 1080p on it with a 4090? I don't think so, that's why this kind of comparison is pointless.
lexluthermiesterExactly! This is not something people should be surprised about, nor whine and complain about.
Agreed. If you buy a top-end graphics card that comes with a cut-down GPU, expecting it to be the flagship of the generation, don't be surprised when the refresh comes out with the fully enabled die and your "flagship" gets relegated to second place, or even worse, drops in value like a rock.
#134
fevgatos
AusWolfWould you consider playing at 1080p on your 4K screen? Is that why you bought it, to play at 1080p on it with a 4090? I don't think so, that's why this kind of comparison is pointless.
I have played at 1440p on my 4K screen, and I have played at 1080p on my 1440p screen, back when I still had a 1080 Ti and DLSS didn't exist. Today, no, I don't do that anymore, because I can use DLSS instead.
#135
theouto
fevgatosYeah, 1080p native looks obviously better. LOL
Yes, and I mean it. Sorry to disappoint, but you are wrong; I didn't pick the DLSS image even when performance is very similar. But maybe this helps you see why people might prefer native rendering, even at a lower resolution.
#136
fevgatos
theoutoYes, and I mean it. Sorry to disappoint, but you are wrong; I didn't pick the DLSS image even when performance is very similar. But maybe this helps you see why people might prefer native rendering, even at a lower resolution.
Not going to argue; if you think the native looks better, more power to you. You happen to disagree with every single reviewer who tested it (and not even with performance equalized), but okay, whatever.

It's very common for people with AMD cards to not like DLSS.
#137
theouto
fevgatosNot going to argue; if you think the native looks better, more power to you. You happen to disagree with every single reviewer who tested it (and not even with performance equalized), but okay, whatever.

It's very common for people with AMD cards to not like DLSS.
Conveniently forgetting where I mentioned that I have used DLSS for years, back when I had a 2070. And yeah, I disagree with the reviewers; I don't need someone to tell me what to think. But agree to disagree, man, we all have different reasons for liking and disliking things.
#138
fevgatos
theoutoConveniently forgetting where I mentioned that I have used DLSS for years, back when I had a 2070. And yeah, I disagree with the reviewers; I don't need someone to tell me what to think. But agree to disagree, man, we all have different reasons for liking and disliking things.
Mentioning something I can't verify isn't a fact. The only fact right now is that everyone who has tested it disagrees (violently, I might add) with your opinion. Of course that doesn't mean you are wrong, but it's certainly an indication that you probably are.

Add in the fact that the 15-20% of gamers on AMD cards dislike DLSS for whatever reason...
#139
TumbleGeorge
I am encouraged by the fact that there are still people who hold their own and do not accept unconditionally, or at all, the trends imposed by companies, some of which serve as justification for selling less real performance and cheaper-to-manufacture hardware at unreasonable prices.
#140
the54thvoid
Intoxicated Moderator
Not seeing much talk about the 4080 speculation. Nothing to contribute beyond what's been said? Then, please, feel free to leave the thread.

This is not about DLSS.
#141
SOAREVERSOR
FoulOnWhiteUnfortunately, as long as there are people who will pay the outlandish prices, it won't stop.
The prices are not "outlandish".

The raw reality is that the constant screaming for better graphics, physics, and more is the problem. Delivering that makes cards go up in cost massively, so each GPU is going to cost more each generation just for better graphics. Not only that, but the cost of the PCBs and all the components on them is going up.

You want prices to drop? There's a good way to do that: no more graphics advances at all, ever again. Instead, demand worse graphics in everything and no more physics at all. Scream at the companies that you want to go back to Doom 3 and Half-Life 2 and never move beyond that, ever again. Scream that 4K, high refresh rates, and all of that should go away and never come back. You'll get GPUs then that are super cheap to make!

GPU companies didn't do this. Gamers who care about graphics as the end-all, be-all did this. It's all their fault.
#142
gmn 17
stimpy88. I will use my brain instead and wait for the 50x0 series.
Yeah, I'll do the same, as I don't like the 16-pin connector on the 40 series. I hope they come up with a better power connector solution for the 50 series.
#143
bug
SOAREVERSORThe prices are not "outlandish".

The raw reality is that the constant screaming for better graphics, physics, and more is the problem. Delivering that makes cards go up in cost massively, so each GPU is going to cost more each generation just for better graphics. Not only that, but the cost of the PCBs and all the components on them is going up.

You want prices to drop? There's a good way to do that: no more graphics advances at all, ever again. Instead, demand worse graphics in everything and no more physics at all. Scream at the companies that you want to go back to Doom 3 and Half-Life 2 and never move beyond that, ever again. Scream that 4K, high refresh rates, and all of that should go away and never come back. You'll get GPUs then that are super cheap to make!

GPU companies didn't do this. Gamers who care about graphics as the end-all, be-all did this. It's all their fault.
No more advances you say? Then what do we do with the billions of extra transistors we can fit within the same space with each node advance?
Of course there is room for improvement without going overboard. There always was.
#144
MentalAcetylide
fevgatosSure, but market prices are dictated by sales. If nobody buys AMD GPUs, of course prices will drop. That's not the point. If you want to compare greediness between companies, you have to compare MSRPs.

That is just not true. 10 years ago Radeon had over 40% of the market share. That's because they were making good cards. Now they aren't.

How is Ryzen a highly esteemed name? It's a much newer brand than Radeon. Obviously because they made good products. Radeon stopped doing that many, many years ago. Thinking that Nvidia sells because of brand name is just full copium.

Starting from the Pascal era (2015), AMD has been failing nonstop: too late to market, not competitive with Nvidia's high-end cards, exorbitant prices, very limited availability. They are not selling because they are failing on multiple fronts.
If it weren't for the fact that there are alternative rendering engines that do not rely on NVIDIA's Iray, they would have a monopoly on the 3D rendering market. In some respects it is one of the superior rendering engines, but it's more expensive and requires more "horsepower" (either multiple GPUs or ass-loads of CPUs). For now, NVIDIA has us by the balls.
#145
Why_Me
neatfeatguyHuh....

I can't honestly say that I know of anyone who's wandering around complaining about the lack of GPUs priced at over $1k and who would be excited to have another one added to the list.

As of right now, locally, the cheapest 4080 I can get is $1200 and the cheapest 4090 is $1650. We'll probably see the current 4080 drop to $1100 and the 4080 Super come in at $1399, while the 4090 maintains its even more outlandish price.

I miss the pricing of the good old days. I'll just have to keep those fond memories alive and avoid paying for this crap... the 980 Ti was only $650, I'm just saying.
pcpartpicker.com/product/PGWzK8/gigabyte-eagle-geforce-rtx-4080-16-gb-video-card-gv-n4080eagle-16gd