
So what does the Ada release by Nvidia mean to you

Has the Ada release by Nvidia been good in your opinion?

Total voters: 101
You've spent about 1.5 pages now talking about numerous upscaling techs while the topic is not entirely that. We get it, you love these techs, good on you. Others don't. Let's move on? This is also not a dick-measuring contest between AMD and Nvidia, which you seem adamant on turning it into.


Here's a little news flash

Ever since AMD started supplying the two consoles, they have shipped roughly as many gaming GPUs as Nvidia, or more.
And in terms of revenue, consider the fact that AMD doesn't have a service to match GeForce NOW either.
Another news flash: check out the Steam Hardware Survey and look at AMD's 15% market share (which includes their mobile GPUs and APUs in general, probably accounting for at least 5% of that).

The top 25 GPUs on Steam are almost entirely Nvidia. Nothing recently released from AMD is present. Zero 6000 series. Zero 7000 series. Keep believing they sell well if you want. They don't. Even AMD knows this; that's why they're skipping high-end GPUs with RDNA4. 99% of AMD buyers are buying their low- to mid-end chips. Like usual.

The best-selling AMD cards have always been the dirt-cheap ones. They need another RX 480/470/580/570 soon, and this is what RDNA4 will focus on.
 
Another news flash: check out the Steam Hardware Survey and look at AMD's 15% market share (which includes their mobile GPUs and APUs in general, probably accounting for at least 5% of that).

The top 25 is almost entirely Nvidia. Nothing new from AMD is present. Zero 6000 series. Zero 7000 series. Keep believing they sell well if you want. They don't.
Since when does the Steam survey include console GPUs?
 
These screenshots look just about the same to me. But then, the questions that follow are 1. How different is your experience in game, and 2. Do you really want a 1080p-like experience on a 4K screen?
Well, if 8K and 320p look the same to you, I don't know what to tell you, man. The 1080p shot is way more pixelated, missing detail, way more aliased, etc. I assume that because you have a 1080p screen, you can't see the difference in pixel count unless you zoom in. Try that.
 
Since when does the Steam survey include console GPUs?
I'm not exactly an expert on AMD, but isn't anything with a G behind it an APU? I don't get where consoles came into the picture.
 
Well, if 8K and 320p look the same to you, I don't know what to tell you, man. The 1080p shot is way more pixelated, missing detail, way more aliased, etc. I assume that because you have a 1080p screen, you can't see the difference in pixel count unless you zoom in. Try that.
Just one question (I might be getting something wrong here): why are both of your screenshots 4K in resolution? The one labelled "1080p" does look more blurry when zoomed in, though.
 
Since when does the Steam survey include console GPUs?
We're talking PC gaming, duh. And consoles use APUs, not GPUs ;)

And btw, the winners of the console market are Sony and Microsoft, not AMD. Their dirt-cheap APUs don't make them much money. Console revenue comes from software sales, accessories and subscriptions. The hardware itself is pretty weak and cheap. Nothing new. This is why devs spend time on optimization, which is easier when the exact hardware is known, unlike on PC.

Nvidia has the Switch and soon the Switch 2, and I bet they make the same or more from selling that weak APU as AMD does selling chips to Sony and MS.
Nvidia never really cared about the console market because it's not that profitable (for those who deliver the chips, that is).

However, Nintendo went with Nvidia again to get DLSS support on the Switch 2, because it will be insanely helpful for a weak device like that.
Nintendo is closing in on 135 million Switch units sold. Right now it holds 3rd place among the best-selling consoles ever, and might take the top spot in a year or two, beating the DS and PS2.

Valve went with AMD for the Steam Deck, and it already struggles in many games; FSR looks horrible in comparison, making it pretty much useless. Soupy graphics without proper upscaling. VSR is mediocre as well, and it can't output to a 2nd monitor without struggling hard, because the chip barely does 800p as it is. Demanding games literally run like crap. I have one; I mostly use it for emulation.

Nintendo can use DLSS on the Switch 2 to make the lower-res display look great without performance dropping off a cliff or draining the battery.
It will also let them do 1440p-4K in docked mode on weaker hardware, using DLDSR with DLSS on top.

So yeah, Nintendo went with the "gimmicky" DLSS over a cheaper AMD solution, for good reason: superior upscaling, downsampling and anti-aliasing. This will matter more for Nintendo and the consumers.
 
Just one question (I might be getting something wrong here): why are both of your screenshots 4K in resolution? The one labelled "1080p" does look more blurry when zoomed in, though.
Because they are both upscaled to 4K. One is using the traditional nearest-neighbor technique and the other one DLSS.
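(For anyone who wants to reproduce a comparison like this, here's a minimal sketch, assuming a 1080p capture saved as shot_1080p.png, a hypothetical filename. The nearest-neighbor side is just a plain resize; the DLSS side has to come from an actual 3840x2160 DLSS capture.)

```python
# Minimal sketch: blow a 1080p screenshot up to 4K with nearest-neighbour
# so it can sit next to a 3840x2160 DLSS capture at the same output size.
# "shot_1080p.png" is a hypothetical filename.
from PIL import Image

src = Image.open("shot_1080p.png")                  # 1920x1080 capture
nearest = src.resize((3840, 2160), Image.NEAREST)   # naive 1080p -> 4K
nearest.save("shot_1080p_to_4k_nearest.png")
```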
 
I'm not exactly an expert on AMD, but isn't anything with a G behind it an APU? I don't get where consoles came into the picture.
We're talking PC gaming, duh. And consoles use APUs, not GPUs ;)

And btw, the winners of the console market are Sony and Microsoft, not AMD. Their dirt-cheap APUs don't make them much money. Console revenue comes from software sales, accessories and subscriptions.
Did you guys read the linked article, seriously? It doesn't look like you did.

Because they are both upscaled to 4K. One is using the traditional nearest-neighbor technique and the other one DLSS.
If it's upscaled, then it's not really 1080p, is it?
 
These screenshots look just about the same to me. But then, the questions that follow are 1. How different is your experience in game, and 2. Do you really want a 1080p-like experience on a 4K screen?
Did you expand the image? 4K DLSS is a LOT less blurry
 
If it's upscaled, then it's not really 1080p, is it?
Yes it is. They are both 1080p upscaled to 4K.

Upscaling doesn't change the image quality, it just makes it fit onto a higher-resolution monitor. Since I'm running 4K....
 
Did you expand the image? 4K DLSS is a LOT less blurry
I can see that after magnifying. But then, both screenshots are 4K resolution, so we're comparing two different ways to upscale, not 1080p native.
 
Did you guys read the linked article, seriously? It doesn't look like you did.
Sorry, it didn't show in the quote. I see what you're saying. And I agree. Steam isn't a good way to measure GPU & APU revenue as a whole.
 
Nobody is telling you that DLSS is a gimmick.
Nobody, except you multiple times?
I don't care what gimmicky features Nvidia is trying to push down my throat
We're still talking about gimmicks in my opinion
I don't care whose shit is worth what. I'm not gonna pay extra for features that I don't care about, features that make my 1080p image look like crap, features that are for streamers, features that are supposed to justify simple cards with small GPUs and little VRAM being sold at premium prices, features that I call gimmicks because they are just that. Simple.
I didn't think I imagined it.
I only said that I consider it that, as scrapping tensor cores, then using the freed-up die real estate for more raster and/or RT hardware could result in better native resolution performance and less need for upscaling to begin with
Last time it was measured (as best it could be), comparing the die area of the ALUs, schedulers and cache puts the Tensor and RT cores combined at 8-10% of the die area; Tensor alone is about two-thirds of that 8-10%. I respect your opinion but politely disagree. I think 8-10% of the die area for RT/Tensor cores, given how they're being innovatively leveraged, is an excellent use, rather than ~8-10% (perhaps) more raster perf.
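(Taking those figures at face value, a quick back-of-the-envelope split, purely illustrative:)

```python
# Rough split of the quoted 8-10% combined RT+Tensor die area,
# with Tensor assumed to be about two-thirds of it.
for combined in (0.08, 0.10):
    tensor = combined * 2 / 3
    rt = combined - tensor
    print(f"combined {combined:.0%} -> Tensor ~{tensor:.1%}, RT ~{rt:.1%}")
# combined 8% -> Tensor ~5.3%, RT ~2.7%
# combined 10% -> Tensor ~6.7%, RT ~3.3%
```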

It seems that most of the time I've heard the word "gimmick" or "feature" used in relation to RT and DLSS, and which one is used aligns quite closely with owners and non-owners of RTX hardware.

And it's very relevant to Ada, I think: pricing can be so variable depending on region that these extra "feature-gimmicks" can absolutely factor in for on-the-fence buyers. Clearly not for yourself, and clearly yes for myself, for example, and it's evident that buyers view it as a feature.
 
I can see that after magnifying. But then, both screenshots are 4K resolution, so we're comparing two different ways to upscale, not 1080p native.
Just think about what you are saying for a second. If 1080p native looked better than upscaled, why the heck would people come up with the idea of upscaling anything? To make it look worse? I mean come on man......
 
Performance uplift, if you disregard the 4060 and 4060 Ti, has been good, and efficiency is impressive, but price-wise it is terrible.
 
Upscaling doesn't change the image quality
It certainly does when sharpening or other filters are applied.

Just think about what you are saying for a second. If 1080p native looked better than upscaled, why the heck would people come up with the idea of upscaling anything? To make it look worse? I mean come on man......
I get what you mean, but I don't take upscaling at face value, which you probably have guessed. ;)

Besides, my earlier point was that X resolution native looks better than X resolution + upscaling. So if you're playing at 4K, then 4K + no DLSS looks better than 4K + any DLSS.

Nobody, except you multiple times?
We're still talking about gimmicks in my opinion.
What I told you is my honest opinion.

Last time it was measured (as best it could be), comparing the die area of the ALUs, schedulers and cache puts the Tensor and RT cores combined at 8-10% of the die area; Tensor alone is about two-thirds of that 8-10%. I respect your opinion but politely disagree. I think 8-10% of the die area for RT/Tensor cores, given how they're being innovatively leveraged, is an excellent use, rather than ~8-10% (perhaps) more raster perf.
If that's true, then fine, only 8-10% more performance isn't worth it.
 
Did you guys read the linked article, seriously? It doesn't look like you did.


If it's upscaled, then it's not really 1080p, is it?
Yes it is. Upscaling does exactly that.
1080p upscaled to 4K/UHD is 4K/UHD output resolution.
4K/UHD downsampled to 1080p is 1080p output resolution. This is how a 1080p display can get immensely better visuals, because native 1080p isn't really crisp and sharp-looking.

This is what consoles do ALL THE TIME, in pretty much ALL GAMES: dynamic resolution on a fixed output resolution, using upscaling.

Tons of PS5/XSX games output 4K/UHD, yet dynamically go from 1080p to 2160p internally on the fly. Hell, some even drop below 1080p; especially the cheaper Xbox can drop to 720p in some games.

Upscaling, downsampling and anti-aliasing are insanely important, and Nvidia has the best features for this.
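(A rough sketch of the dynamic-resolution idea described above, with made-up frame times and a hypothetical pick_internal_height helper: the internal render height chases the frame-time budget while the output stays fixed at 2160p.)

```python
# Hedged sketch of dynamic resolution scaling: internal render height follows
# the frame-time budget, output stays fixed at 2160p. Numbers are illustrative.
TARGET_MS = 16.7          # ~60 fps budget
OUTPUT_HEIGHT = 2160      # fixed output/display resolution
MIN_H, MAX_H = 1080, 2160

def pick_internal_height(current_h: int, last_frame_ms: float) -> int:
    """Nudge internal resolution up or down based on the last frame time."""
    if last_frame_ms > TARGET_MS and current_h > MIN_H:
        return max(MIN_H, current_h - 72)   # over budget: drop resolution
    if last_frame_ms < TARGET_MS * 0.85 and current_h < MAX_H:
        return min(MAX_H, current_h + 72)   # headroom: raise resolution
    return current_h

h = 1800
for frame_ms in (18.2, 17.5, 15.1, 13.9, 14.0):   # fake frame times
    h = pick_internal_height(h, frame_ms)
    print(f"frame {frame_ms:4.1f} ms -> render {round(h * 16 / 9)}x{h}, upscale to 3840x{OUTPUT_HEIGHT}")
```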
 
Besides, my earlier point was that X resolution native looks better than X resolution + upscaling. So if you're playing at 4K, then 4K + no DLSS looks better than 4K + any DLSS.
That's also not true (according to tens of reviewers, mind you), but it's kinda irrelevant. And I'm saying it's irrelevant because in these cases DLSS increases your framerate, which I assume was the goal of enabling it. If you don't need the fps, then you do DLDSR + DLSS and send native packing.

It just doesn't make much sense to play natively anymore, in terms of either performance or image quality.
 
Yes it is. Upscaling does exactly that.
1080p upscaled to 4K/UHD is 4K/UHD.

This is what AMD's consoles do ALL THE TIME, in pretty much ALL GAMES: dynamic resolution on a fixed output resolution.

Tons of PS5/XSX games output 4K/UHD, yet dynamically go from 1080p to 2160p internally on the fly.
In case anyone wonders why I only game on PC (besides the fact that I'm too clumsy to use a controller properly)...
 
That's the crux of it. There's not much comparison to do looking at screenshots on my 1080p screen, but that's the best I can do at the moment. Like I said earlier, I might be looking at a curved ultrawide at some point, I just don't want to risk not having enough performance to use it at native in every game and/or ending up not liking how upscaling looks on it (for which there is a chance considering that I definitely don't like upscaling at 1080p).
That's the core of the issue to me as well. You simply cannot count on it: you cannot count on the performance boost; you cannot count on the latency being right; you cannot count on having a great implementation that won't bother you compared to native, or that won't shimmer.

However, we have to admit by now that 'native' (which is really native + TAA) isn't always the most reliable thing anymore either. The waters are muddy. All the more reason not to count on upscaled 'perf' to have enough for what you want to play, though... But that quickly gets into personal preferences; any blanket statement that upscaling is always better omits so many aspects of it, I just can't.

But even now, in Starfield, FSR2... honestly... yesterday I forgot it was on. It's pretty much flawless there. I can see where people are coming from when they've used DLSS in that way for a while.

I remember adding mine to cart and half hoping it would sell out before I got to checkout. :laugh:

Although I was a bit higher on the 4080 and was mildly tempted to grab the FE model, but I kept looking at 4K RT results in Cyberpunk to talk myself out of it, lol.
That internal dialogue when buying a GPU... it's an amazing thing, isn't it? I recognize this strongly :D

I'm pretty sure that's not true. The Switch on its own probably outsells both consoles and all desktop AMD GPUs.
So the yearly revenue numbers from graphics divisions are lying now because you assume so? Ok. You do you ;)
 
That's also not true (according to tens of reviewers, mind you), but it's kinda irrelevant. And I'm saying it's irrelevant because in these cases DLSS increases your framerate, which I assume was the goal of enabling it. If you don't need the fps, then you do DLDSR + DLSS and send native packing.

It just doesn't make much sense to play natively anymore, in terms of either performance or image quality.
Let's just agree to disagree and call it a day, shall we? :)

I obviously won't move from thinking of upscaling as a helping hand in low-performance situations, and you obviously won't move from thinking of it as some kind of magic.
 
I'm not exactly an expert on AMD, but isn't anything with a G behind it an APU? I don't get where consoles came into the picture.
No, although for the longest time Intel APUs (HD Graphics) were the most popular gaming machines on Steam, IIRC.
 
In case anyone wonders why I only game on PC (besides the fact that I'm too clumsy to use a controller properly)...

Dynamic 4K can easily look better than native 1080p; this is why they do it. 1080p is the worst case for the PS5 and XSX; most often they run at 1440p-1800p, sometimes reaching the full 2160p/4K/UHD. Native 1080p would look worse 95% of the time (whenever the internal resolution goes higher than 1080p).

You need to get rid of the mindset that "native" is always better. It is not always better, especially not once you factor in AA solutions and sharpening. Once again: https://www.rockpapershotgun.com/outriders-dlss-performance
DLSS Quality here makes the visuals better, not worse: more detail, less shimmering, fewer jaggies.

If you refuse to use features like this, you are stuck with TAA or other old AA methods, which are far worse than DLAA; even DLSS, and sometimes FSR, often beats them (while delivering a performance bump on top). FSR has a lot worse AA than DLSS, though, with more shimmering and jaggies (read any DLSS vs FSR comparison).

DLAA beats them all with ease in terms of AA and visuals. That is the whole point of DLAA: superior anti-aliasing, nothing more. It beats ANY AND ALL other AA methods, and since it is a DLSS preset now, tons of games have it and all DLSS games will have it going forward. Nvidia changed this a while ago to increase the adoption rate, meaning DLAA is present in MANY GAMES.

A 1080p panel running DLAA looks closer to 1440p than to 1080p in terms of detail and sharpness.
A 1080p panel with 4K downsampled using DLDSR looks closer to 1800p or even native 4K than to 1440p, offering far superior visuals to native 1080p.

This is the magic of (proper) upscaling and downsampling.

Your native resolution means very little; the visuals you actually see on screen are what matter, and this is where stuff like DLSS/DLDSR/DLAA/FSR/CAS comes in. You can use a 720p display, enable DLDSR and get close to 1440p visuals - WITHOUT actually having the performance to run 1440p natively.

I have played tons of games using 4K DLDSR on my 1440p monitor. It looks close to the native 4K my OLED TV does. And the best part is that pretty much all games support it, because it just shows up as a new resolution in games, meaning I can actively choose 3840x2160 in games on a 2560x1440 monitor. Games don't need DLSS support at all for it to work.
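(For the curious, a crude stand-in for the render-high-then-downsample idea, assuming a 4K frame saved as render_2160p.png, a hypothetical filename. DLDSR's actual downsampling filter is AI-trained; plain Lanczos here only illustrates the principle of supersampling down to a 1440p panel.)

```python
# Crude supersampling illustration: take a 3840x2160 render and filter it
# down to a 2560x1440 panel. DLDSR uses an AI filter; Lanczos is a stand-in.
from PIL import Image

hi_res = Image.open("render_2160p.png")              # assumed 4K render
panel = hi_res.resize((2560, 1440), Image.LANCZOS)   # downsample to native
panel.save("displayed_1440p.png")
```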
 
Just think about what you are saying for a second. If 1080p native looked better than upscaled, why the heck would people come up with the idea of upscaling anything? To make it look worse? I mean come on man......
Graphics development has ALWAYS been about magic sauce to remove detail where it wasn't 'truly needed' and save that performance for detail that is more readily in view. What's happening with upscaling is nothing new in the space at all. At the same time, all these technological advances have radically changed the image nonetheless. In the end it's the presentation of a bunch of pixels, and our mind making it into a piece of film we can interpret well.

A lot of personal preference comes into this, especially now that we're quibbling over nearly the quality of individual pixels, which is where we are now. Much like extremely high-refresh screens way beyond 120 Hz, these techs are in the area of diminishing returns. Motion resolution, for example: some people can't stand a game without motion blur, others can't play a game with motion blur. It's a simple graphical effect that we have almost everywhere, and yet a LOT of gamers instantly turn it off. It even relates to the kind of monitor you have and how fast it is. Resolution is another: more detail isn't always preferable for your mind/eyes to resolve. More isn't always better. Less info on screen means you respond faster to what happens, for example.
 
Graphics development has ALWAYS been about magic sauce to remove detail where it wasn't 'truly needed' and save that performance for detail that is more readily in view. What's happening with upscaling is nothing new in the space at all. At the same time, all these technological advances have radically changed the image nonetheless. In the end it's the presentation of a bunch of pixels, and our mind making it into a piece of film we can interpret well.

A lot of personal preference comes into this, especially now that we're quibbling over nearly the quality of individual pixels, which is where we are now. Much like extremely high-refresh screens way beyond 120 Hz, these techs are in the area of diminishing returns. Motion resolution, for example: some people can't stand a game without motion blur, others can't play a game with motion blur. It's a simple graphical effect that we have almost everywhere, and yet a LOT of gamers instantly turn it off. It even relates to the kind of monitor you have and how fast it is. Resolution is another: more detail isn't always preferable for your mind/eyes to resolve. More isn't always better. Less info on screen means you respond faster to what happens, for example.
Saying that more detail isn't always better, although I disagree, doesn't answer the question of whether DLSS improves image quality, which was what we were arguing about. You might not prefer more detail, but that wasn't the argument.

To be honest, it's not really worth having a debate about it; whoever disagrees that DLSS improves image quality is doing so in bad faith, IMO. Which is the case with everything Nvidia does...
 