
So what does the Ada release by Nvidia mean to you?

Has the Ada release by Nvidia been good in your opinion?


  • Total voters
    101
Yes, it's a clear demonstration that upping your resolution while using dlss improves image quality considerably compared to native while getting similar performance.
Compared to native 720p... Well, I'll be damned! :wtf:
 
Compared to native 720p... Well, I'll be damned! :wtf:
But they are both 720p. The resolution doesn't change anything: 1440p + dlss q looks better than 1080p, and 4k + dlss q looks better than 1440p.

I'm not sure why you are objecting to something so obvious....
 
If I ever upgrade, it'll probably be a curved 3440x1440 ultrawide. I don't see any point in getting a new monitor only for its resolution.

Same here, if I ever upgrade from my entry level 21:9 2560x1080 75 Hz monitor then it's gotta be another Ultrawide with a better panel than what I currently have.
Don't really care about refresh rate or higher resolutions, cause for me as a budget-to-mid-range user those would be a bit difficult to drive in new games, and I ain't upgrading GPUs that often.

Personally I'm fine with upscaling techs like DLSS, but if FSR 3 turns out to be any decent then I would have no issues going back to AMD whenever I upgrade from my 3060 Ti. (That won't happen anytime soon..)

And yea, some ppl just love to force their ways/opinions on others like it's THE only way. I've also played on a 1080p/60 Hz monitor for years and I've had no issues with it whatsoever.
 
But they are both 720p. The resolution doesn't change anything: 1440p + dlss q looks better than 1080p, and 4k + dlss q looks better than 1440p.

I'm not sure why you are objecting to something so obvious....
Because it's not so obvious. Image quality depends a lot on personal taste, it always has. I'm not sure why you feel the need to argue about it.

Or have you never heard of people who like playing on low quality because "it feels more crisp than ultra", or people who like turning off things like motion blur or lens flare because "it's annoying"? (I'm not one of those people, just asking)
 
Because it's not so obvious. Image quality depends a lot on personal taste, it always has. I'm not sure why you feel the need to argue about it.
And what does your personal taste tell you about the screen shots posted above?
 
And what does your personal taste tell you about the screen shots posted above?
That 4K is better than 720p. That by using deep learning algorithms, you can make a shit image a bit less shit.

Before you ask, it tells me absolutely nothing about buying a new monitor (especially since the 4K+DLSS/UP mode looks blurry as heck).

Edit: If "DLSS ultra performance" works with a 720p input image at 4K, then it's basically the same as "DLSS quality" at 1080p, which is... (you guessed it)... blurry.
 
I think you just need to accept that AMD has been playing catch up with Nvidia for years in terms of features. AMD does copy/paste but can't match or beat Nvidia's approach, because they lack R&D funds and don't have dedicated cores for stuff like this. They won't match Nvidia with software alone.

Nvidia came up with Gsync. AMD came up with Freesync, which was not their invention but simply uses standard VRR tech like Gsync Compatible monitors do. This is fine, but the Gsync module still has advantages (higher refresh rate and fewer issues overall, especially in windowed mode).
Nvidia came up with DLSS. AMD came up with FSR, which is worse, often way worse. Not a single RTX owner will use FSR if DLSS is present, and it mostly is. We are close to 500+ games with DLSS/DLAA now.
Nvidia came up with DSR. AMD actually matched it most of the time with VSR, but then Nvidia released DLDSR, which beats VSR and DSR while even bumping up performance a lot.
And AMD has no answer to DLAA or DLSS 3.x - so far FSR3 looks to be another miss with simple interpolation.

Reflex... Shadowplay... CUDA... RT performance... And I could go on... AMD can't match it. Hence the lower price. You are free to live in denial, but features like Reflex, DLSS, DLAA and DLDSR + good RT perf and tons of RTX mods and upcoming games like Half Life 2 RTX are the reason I went Nvidia again. There are tons of RTX-modded games which completely change how old games play and look.

The only issues I have are connected to Nvidia's proprietary software, 'wake up from sleep' in particular;

I had to set up scripts to set things manually.




I think this is a good point. Not everyone is capable of setting up a virtual machine with GPU passthrough.

Personally I play Dota 2 almost exclusively and it works fine.



I've used Slackware before, now I'm using Arch, and right now I prefer the Linux desktop even on a laptop. There are 2 engineering apps for PLC programming that I run in virtual machines.

Big difference between "works fine" and getting full performance out of your hardware. I have tried gaming on Linux multiple times including SteamOS etc and it's been mostly a joke with tons of bugs, wonky performance and crashes left and right. I even have a SteamDeck collecting dust.

For me, running Linux as desktop is just more trouble and more problem solving meaning more downtime and less time for what I actually want to do. I don't see the point in mentioning Linux issues because you are asking for it yourself by running a sub 1% marketshare OS with zero focus and optimization from 99.9% of developers.

Linux for servers. Love it. My own debian server has like 4 years uptime right now.. For desktop use, nah..
 
That 4K is better than 720p. That by using deep learning algorithms, you can make a shit image a bit less shit.

Before you ask, it tells me absolutely nothing about buying a new monitor (especially since the 4K+DLSS/UP mode looks blurry as heck).

Edit: If "DLSS ultra performance" works with a 720p input image at 4K, then it's basically the same as "DLSS quality" at 1080p, which is... (you guessed it)... blurry.
It's blurry compared to native 1080p, but that's again a flawed comparison. 1440p + dlss q is the proper comparison to native 1080p and of course, the former is much better. I mean I can post you some comparisons but they won't convince you, you'll start making up stuff again
 
Sinking ship

Sinking ship, yes. Because graphics card shipments are at an all-time low and continue to decrease, with an outlook for this market segment to become niche or to stop existing altogether.
You must throw the rose-tinted glasses out of the window and start seeing that TSMC's 3nm process is one of, if not the, last manufacturing process. Because after that, physics strikes and you will not be able to manufacture newer chips.

Read, educate yourself.

Oof. Desktop GPU sales are down almost 40% year-to-year

However, in the same interview, Moore recognized that there are two basic physical obstacles that will eventually preclude any further miniaturization. As he recalled the cosmologist Stephen Hawking once pointing out on a visit to Silicon Valley, nothing can travel faster than the speed of light, while materials are, ultimately, made of atoms of a finite size. There are, in other words, speed and size limits to chips. “These are fundamentals I don’t see how we [will] ever get around,” Moore warned. “And in the next couple of generations, we’re right up against them.”

:D
 
Sinking ship, yes. Because graphics card shipments are at an all-time low and continue to decrease, with an outlook for this market segment to become niche or to stop existing altogether.
You must throw the rose-tinted glasses out of the window and start seeing that TSMC's 3nm process is one of, if not the, last manufacturing process. Because after that, physics strikes and you will not be able to manufacture newer chips.

Read, educate yourself.

Oof. Desktop GPU sales are down almost 40% year-to-year

However, in the same interview, Moore recognized that there are two basic physical obstacles that will eventually preclude any further miniaturization. As he recalled the cosmologist Stephen Hawking once pointing out on a visit to Silicon Valley, nothing can travel faster than the speed of light, while materials are, ultimately, made of atoms of a finite size. There are, in other words, speed and size limits to chips. “These are fundamentals I don’t see how we [will] ever get around,” Moore warned. “And in the next couple of generations, we’re right up against them.”

:D

AMD's GPU department will sink long before Nvidia's. That is for sure. Nvidia ships and sells way more GPUs than AMD. This is true for every generation. Maybe you should look at Steam's Top 25 GPU List :roll:

Even the mediocre 4060 series will probably outsell the entire Radeon 7000 series.. :laugh:

Oh, desktop GPU shipments go down when GPU mining crashes... What a shocker :laugh: :laugh: LMAO...

TSMC is already talking about 2nm, and Intel has 20A and 18A (2nm and 1.8nm) ready after Intel 4, which Meteor Lake will use. Arrow Lake is going to be a leap.

-> https://en.wikipedia.org/wiki/2_nm_process

Don't worry, they will find a way to improve performance, and architecture matters more anyway.
Nvidia beat AMD using the cheap Samsung 8nm (more like 10nm) node when AMD used 7nm TSMC.
Nvidia destroyed AMD when going back to TSMC 4/5nm with the 4090. The 7900XTX is overall 25% slower in 4K gaming + lacking features and RT performance, all while Nvidia was not even trying and AMD tried their best (with a 999 dollar MSRP card that did not sell well - AMD mostly wanted to sell the 7900XT instead)

AMD high-end GPUs simply don't sell well. This is why they leave out high-end RDNA4 GPUs -> https://overclock3d.net/news/gpu_di..._series_rdna_4_gpu_lineup_-_rumours_suggest/1
 
AMD's GPU department will sink long before Nvidia's. That is for sure. Nvidia ships and sells way more GPUs than AMD. This is true for every generation. Maybe you should look at Steam's Top 25 GPU List :roll:

Even the mediocre 4060 series will probably outsell the entire Radeon 7000 series.. :laugh:

Oh, desktop GPU shipments go down when GPU mining crashes... What a shocker :laugh: :laugh: LMAO...

TSMC is already talking about 2nm, and Intel has 20A and 18A (2nm and 1.8nm) ready after Intel 4, which Meteor Lake will use. Arrow Lake is going to be a leap.

-> https://en.wikipedia.org/wiki/2_nm_process

Don't worry, they will find a way to improve performance, and architecture matters more anyway.
Nvidia beat AMD using the cheap Samsung 8nm (more like 10nm) node when AMD used 7nm TSMC.
Nvidia destroyed AMD when going back to TSMC 4/5nm with the 4090. The 7900XTX is overall 25% slower in 4K gaming + lacking features and RT performance, all while Nvidia was not even trying and AMD tried their best (with a 999 dollar MSRP card that did not sell well - AMD mostly wanted to sell the 7900XT instead)

AMD high-end GPUs simply don't sell well. This is why they leave out high-end RDNA4 GPUs -> https://overclock3d.net/news/gpu_di..._series_rdna_4_gpu_lineup_-_rumours_suggest/1
Mining ended way before this year.

PC gaming won't be viable if/when the next generation of PC gamers can't afford an entrance ticket.

Do you think Nvidia will reduce or increase pricing next generation?

IMHO Arf is hyperbolic but there's some merit to some of his points.

Whytaf would Nvidia sell you a 5090 for less than 2k if that chip's worth 4k or more to AI?

Do you think a 4060 at £400-500 is affordable to young gamers or their hard-working parents?

I don't, and I worry that PC gaming is being priced out of viability.
 
I understand you deliberately want to deny these features are awesome, because you can't use them. Once again, it's like seeing a 1080p SDR TV owner claiming that 4K HDR OLED is just a gimmick.

Hours of tweaking? :roll: Literally takes seconds.

AMD has no AA solution that comes close to DLAA.
AMD has no upscaling that comes close to DLSS.
AMD has no downsampling that comes close to DLDSR.

Facts for you. And every single person with proper RTX experience knows this is true. Or you can read the 1000's of tests that confirm it.

Techspot compared DLSS2 with FSR2 and FSR2 did not win in a single game. DLSS2 easily won in pretty much every one of them, about 26 different titles.

AMD doesn't even have a feature to counter DLAA or DLDSR, or DLSS 3 or 3.5 for that matter. Reflex, yeah no. You pay less to get less. If AMD was just as good or better, they would have higher marketshare.

When you have had DLSS, DLDSR and DLAA for years, you simply don't accept what AMD is offering, even if you can save 100 bucks on average. Simply not worth it when resale value is lower for AMD hardware anyway (less demand, and AMD lowers prices a lot over a generation).
You've spent about 1.5 pages now talking about numerous upscaling techs while the topic is not entirely that. We get it, you love these techs, good on you. Others don't. Let's move on? This is also not a dick comparison between AMD and Nvidia, though you seem adamant on making it one.

AMD's GPU department will sink long before Nvidia's. That is for sure. Nvidia ships and sells way more GPUs than AMD. This is true for every generation. Maybe you should look at Steam's Top 25 GPU List :roll:

Even the mediocre 4060 series will probably outsell the entire Radeon 7000 series.. :laugh:

Oh, desktop GPU shipments go down when GPU mining crashes... What a shocker :laugh: :laugh: LMAO...

TSMC is already talking about 2nm, and Intel has 20A and 18A (2nm and 1.8nm) ready after Intel 4, which Meteor Lake will use. Arrow Lake is going to be a leap.

-> https://en.wikipedia.org/wiki/2_nm_process

Don't worry, they will find a way to improve performance, and architecture matters more anyway.
Nvidia beat AMD using the cheap Samsung 8nm (more like 10nm) node when AMD used 7nm TSMC.
Nvidia destroyed AMD when going back to TSMC 4/5nm with the 4090. The 7900XTX is overall 25% slower in 4K gaming + lacking features and RT performance, all while Nvidia was not even trying and AMD tried their best (with a 999 dollar MSRP card that did not sell well - AMD mostly wanted to sell the 7900XT instead)

AMD high-end GPUs simply don't sell well. This is why they leave out high-end RDNA4 GPUs -> https://overclock3d.net/news/gpu_di..._series_rdna_4_gpu_lineup_-_rumours_suggest/1
Here's a little news flash

Ever since AMD got the two consoles, they have shipped roughly as many or more gaming GPUs than Nvidia.
And in terms of revenue, consider the fact that AMD doesn't have a service to match GeForce NOW either.
 
Big difference between "works fine" and getting full performance out of your hardware. I have tried gaming on Linux multiple times including SteamOS etc and it's been mostly a joke with tons of bugs, wonky performance and crashes left and right. I even have a SteamDeck collecting dust.

For me, running Linux as desktop is just more trouble and more problem solving meaning more downtime and less time for what I actually want to do. I don't see the point in mentioning Linux issues because you are asking for it yourself by running a sub 1% marketshare OS with zero focus and optimization from 99.9% of developers.

Linux for servers. Love it. My own debian server has like 4 years uptime right now.. For desktop use, nah..

It all depends on the perspective.

For me, what you wrote makes no sense - I feel bad running silly MS software which I cannot customize, and I have to install more silly software to make it usable.

Wait a sec, are we talking desktop experience? I don't even know what I would need to do to customize my Windows desktop the way I like, and I'm running basic KDE with a minimal setup. I couldn't move my taskbar in Windows.

I feel the hardware is better optimized, and you can see this by looking at tests or simply running Geekbench on Linux. AMD and Intel pay attention because, as you noticed, all the servers and professional environments run on Linux, so the hardware support is amazing, except for Nvidia - Nvidia is absolutely terrible. They released an open-source kernel module and left the driver for the community to build? I'm gonna get rid of both this card and the monitor.
 
Servers and professional workstations have been running Unix/Linux variants for almost as long as I've been alive (which is a good while) without making any kind of meaningful inroads into the desktop world. No doubt there's a reason for that and no doubt, despite the perennial articles predicting the opposite, it's a state of affairs which will persist long after I've put on my wooden suit.
 
It's blurry compared to native 1080p, but that's again a flawed comparison. 1440p + dlss q is the proper comparison to native 1080p and of course, the former is much better. I mean I can post you some comparisons but they won't convince you, you'll start making up stuff again
Why is it a flawed comparison? If upping my resolution and using more aggressive upscaling results in a blurry image, then it's wasted money, isn't it? From my point of view, you're the one making stuff up by comparing 4K+DLSS to ancient resolutions that nobody uses anymore.

Mining ended way before this year.

PC gaming won't be viable if/when the next generation of PC gamers can't afford an entrance ticket.

Do you think Nvidia will reduce or increase pricing next generation?

IMHO Arf is hyperbolic but there's some merit to some of his points.

Whytaf would Nvidia sell you a 5090 for less than 2k if that chip's worth 4k or more to AI?

Do you think a 4060 at £400-500 is affordable to young gamers or their hard-working parents?

I don't, and I worry that PC gaming is being priced out of viability.
Well, at least the 7800 XT sold out on day one or two everywhere in the UK, so I have no fear of the PC gaming market. People have been saying it's a dying breed, but here we are, still discussing high-end cards and playing our games decades later.

The last couple of years were a test to see if people are willing to pay ridiculous prices for a GPU, and even with mining dead, the results are mixed at best.
 
Ever since AMD got the two consoles, they have shipped roughly as many or more gaming GPUs than Nvidia.
And in terms of revenue, consider the fact that AMD doesn't have a service to match GeForce NOW either.
I'm pretty sure that's not true. The Switch on its own probably outsells both consoles and all desktop AMD GPUs.

Why is it a flawed comparison? If upping my resolution and using more aggressive upscaling results in a blurry image, then it's wasted money, isn't it? From my point of view, you're the one making stuff up by comparing 4K+DLSS to ancient resolutions that nobody uses anymore.
Because you are comparing different internal resolutions. When you match those, dlss is just better.

I'm comparing 1440p dlss q to 1080p native. First looks much better while offering similar performance. Not really debatable, but you'll debate it none the less for whatever reason.
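For reference, a rough sketch of why the performance ends up in the same ballpark (again assuming the usual ~2/3 per-axis scale for DLSS Quality, so treat the numbers as illustrative):

Code:
# Illustrative pixel-count comparison: 1440p + DLSS Quality vs native 1080p.
# Assumes DLSS Quality renders at roughly 2/3 of the output resolution per axis.
def pixels(w, h):
    return w * h

dlss_q_internal = pixels(round(2560 * 2 / 3), round(1440 * 2 / 3))  # ~1707 x 960
native_1080p    = pixels(1920, 1080)

print(dlss_q_internal, native_1080p)            # 1638720 vs 2073600
print(f"{dlss_q_internal / native_1080p:.0%}")  # ~79% of the native-1080p pixel load

So the internal workload is actually a bit lower than native 1080p, and the reconstruction overhead eats part of that back, which is roughly where the "similar performance" comes from.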
 
Because you are comparing different internal resolutions. When you match those, dlss is just better.
I don't care about the internal resolution. I care about what I see on screen. If my monitor does resolution X, that's the resolution I'm gonna be comparing different settings at.

I'm comparing 1440p dlss q to 1080p native. First looks much better while offering similar performance. Not really debatable, but you'll debate it none the less for whatever reason.
Go ahead then and compare them, and stop making assumptions about me.
 
A proper discussion requires that both parties debate in good faith, which isn't happening here so you may as well lay it to rest.
 
A proper discussion requires that both parties debate in good faith, which isn't happening here so you may as well lay it to rest.

It would always be a somewhat subjective thing and a user would have to use it a couple weeks and have both a 1440p and 1080p monitor. Also, DLSS, while very good in most games, does occasionally have a crap implementation, and in that scenario you'd be stuck running native anyways, which isn't for everyone, especially those on strict budgets.

If I gamed at 1080p I'd definitely be using DLAA, though, or DLDSR in combination with DLSS, although even at 1080p you'd likely need a 4070 Ti or above to do that anyways, well beyond the hardware most people have.
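For anyone curious what that DLDSR + DLSS combo actually works out to at 1080p, here's a rough sketch (assuming DLDSR's 2.25x pixel-count factor and the usual ~2/3 per-axis DLSS Quality scale; the exact numbers are illustrative):

Code:
import math

# Hypothetical illustration of DLDSR 2.25x + DLSS Quality on a 1080p monitor.
native_w, native_h = 1920, 1080
dldsr_factor = 2.25        # DLDSR pixel-count multiplier
dlss_q_scale = 2 / 3       # assumed per-axis render scale for DLSS Quality

axis_scale = math.sqrt(dldsr_factor)                                                 # 1.5x per axis
dldsr_w, dldsr_h = round(native_w * axis_scale), round(native_h * axis_scale)        # 2880 x 1620
render_w, render_h = round(dldsr_w * dlss_q_scale), round(dldsr_h * dlss_q_scale)    # 1920 x 1080

print(f"Game renders {render_w}x{render_h}, DLSS reconstructs {dldsr_w}x{dldsr_h}, "
      f"DLDSR downsamples back to {native_w}x{native_h}")

So you end up rendering roughly a native-1080p pixel load, but with AI reconstruction and a downsample on top, which is why it still isn't free compared to plain native 1080p.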
 
It would always be a somewhat subjective thing and a user would have to use it a couple weeks and have both a 1440p and 1080p monitor.
That's the crux of it. There's not much comparison to do looking at screenshots on my 1080p screen, but that's the best I can do at the moment. Like I said earlier, I might be looking at a curved ultrawide at some point, I just don't want to risk not having enough performance to use it at native in every game and/or ending up not liking how upscaling looks on it (for which there is a chance considering that I definitely don't like upscaling at 1080p).
 
That's the crux of it. There's not much comparison to do looking at screenshots on my 1080p screen, but that's the best I can do at the moment. Like I said earlier, I might be looking at a curved ultrawide at some point, I just don't want to risk not having enough performance to use it at native in every game and/or ending up not liking how upscaling looks on it (for which there is a chance considering that I definitely don't like upscaling at 1080p).

I have 1080p/1440p/4k monitors so it's easy to compare but that's not realistic for most people. Never been a fan of ultrawide but those new oleds have me second guessing that.

My wife is making me get rid of a lot of stuff so I'm gonna have to downsize when moving into my new house lol.
 
60% of those Steam users are probably on a laptop and/or on integrated graphics. They don't care about games cause they can't even run them.


With that said, and as I was telling you before regarding monitors, this is what you can do with DLSS.

[attached screenshot: DLSS comparison]
That clearly looks like it was "post-processed": zoomed 400% to allow DLSS to smooth it, and then captured.
 
I voted no even though I bought a 4090 like a chump. I wanted more vram, and didn't want to lose dldsr. Felt like that left me with really only two choices... 4080 or 4090. And the 4080 just feels like such a spit in the face I couldn't do it. So I had a few drinks, sucked it up, and bought a damn 4090.
 
I voted no even though I bought a 4090 like a chump. I wanted more vram, and didn't want to lose dldsr. Felt like that left me with really only two choices... 4080 or 4090. And the 4080 just feels like such a spit in the face I couldn't do it. So I had a few drinks, sucked it up, and bought a damn 4090.

I remember adding mine to cart and half hoping it would sell out before I got to checkout. :laugh:

Although I was a bit higher on the 4080 and was mildly tempted to grab the FE model, I kept looking at 4K RT results in Cyberpunk to talk myself out of it lol.
 