
So what does the Ada release by Nvidia mean to you?

Has the Ada release by Nvidia been good, in your opinion?


  • Total voters
    101
Status
Not open for further replies.
I have 1080p/1440p/4K monitors, so it's easy to compare, but that's not realistic for most people. I've never been a fan of ultrawide, but those new OLEDs have me second-guessing that.

My wife is making me get rid of a lot of stuff so I'm gonna have to downsize when moving into my new house lol.
I never really wanted an ultrawide myself, but like I said, swapping out a perfectly good monitor only for the higher resolution feels like a total waste. :ohwell:

If I upgrade, I might as well make it worth the investment in some other areas as well.
 
People love to argue about graphics cards and CPUs, but honestly, the most transformative upgrade can be a monitor, especially something with proper HDR (OLED/mini-LED). That's a whole other topic and money sink, though, so everyone, as with anything, has to decide what's best for them.
 
I let my wallet talk, and avoided credit :D

Or else I would regret looking at my 4090 every time I looked towards my computer :D

Heck, I would still be using my 3070 Ti, except my boy's 980 was getting tired.

I should have got a 4090. I will wait till next round :)
 
I know a lot of people would call it blasphemy, but the 2080 Ti in my secondary PC was getting pretty tired and needed to be retired; it was starting to struggle a bit at 1440p, even with DLSS. The 3080 Ti was pretty transformative for my wife and gave me an excuse to get the 4090 lol... poor me.
 
I actually crashed Spider-Man on my kid's PC at just 1080p... tried running max settings. Booo, wasn't expecting that :D

Sooo... maybe 8GB isn't enough :laugh:

My youngest can have the 3070 Ti, my oldest will get my 4070 Ti, and I will get a flagship 50 series, if it is called that.

My last flagship was a GTX 580... so it's time :)
 
I have the 4080 and am more than pleased. No issues for almost a year now.
 
Moving to 1440UW was one of the two best changes I've made, in terms of gaming. The other was moving to OLED 1440UW. The difference is truly night and day and the added enjoyment is nearly immeasurable.
 
I've primarily been using a 48 C1 and a 65 G2 and they're pretty amazing for 4k gaming.
 
I'm scared of OLED, with all the rumours around burn-in of desktop and UI elements. A friend of mine has an OLED phone, and the YouTube logo has burnt into his screen pretty badly. UW is tempting, though.
 
I went from a cheapo 10+ year old Samsung 23" 1080p to a cheapo Walmart-special LG 50" UHD TV... both are 60Hz, but I prefer my TV :D

Buuut I am not sitting at a desk these days :toast:
 
32:9 is uber!
The MSi Pro MP341CQ is relatively affordable (although it's 21:9). I'm tempted to surprise myself with one for Christmas. I'm just a bit reluctant for the above mentioned reasons (even with the 7800 XT). 32:9 screens are way above my price range.
 
All I know is that if FSR3 Frame Gen doesn't impress me or match DLSS3 next year, I am probably going to go with Nvidia RTX 5090 or RTX 4080 Ti for my next build in a few years. Happy where I am for now, but yeah. DLSS3 is pretty impressive stuff.
 
Ada? I was expecting more for the price. It's mostly about pricing, which started back in the days of Turing, but then again, all goods these days have jumped up in price.
Ada is my first foray into Nvidia's DLSS tech (I came from a GTX 1070), and I can immediately see how superior it is to FSR; heck, even XeSS looks better than FSR.
I was supposed to get the RTX 3070 three years ago in combination with the new Zen 3 platform, but I was put off by its positioning, the skyrocketing prices due to the pandemic and scalpers, and its 8GB of VRAM. Looking at the AMD side, it's no better, and what bothers me more is that even today, RDNA2 GPUs in my region are very expensive compared to Nvidia: a 6800 XT here still costs almost double a 4070, and even the newly arrived 7800 XT costs a lot more than a 4070, despite being supposed to be the cheaper value.
Feels like the Pascal generation days and earlier were the best times.
 
I came from a triple-1080p Surround/Eyefinity background, but over the years I got tired of the little issues from green/red. So I said F it, YOLO'd into a G9, and it's been a gorgeous experience.
 
No bother for me, as the monitor has a 5-year warranty which includes burn-in. If it's going to develop, it'll certainly show up by then. After 6 months or so, there's not a trace to be seen.

32:9 is too wide for me though. I've tried one, but no thanks I like 21:9 better. Maybe because I sit relatively close to the screen. My dream, however, would be a 2:1 screen, say 4000x2000, as ideally I think 1440UW is a bit lacking in height. No big deal though; I'm well pleased as it is.
 
I had a similar problem where I live. Back when AMD reduced its prices on the 6800 XT and 6950 XT in response to the 4070 and whatnot, I could not find those deals in my country, at least at the time when I was GPU shopping, which is why I bought a 4070, ran some benchmarks, and returned it, as it was performing just barely better than my 3070: only 1 fps better in Heaven. I know benchmarks aren't everything, but it still left a really bad taste in my mouth, so I returned it for the 4090.

Sorry AMD and Intel, I would have loved to support you, but I need good downsampling, and only Nvidia offers that right now. Internationally competitive prices would have really helped too. Nvidia cards always seem to be at the equivalent of US MSRP, but I can't say the same for AMD and Intel, at least not when I was looking.
 
It's mostly a supplier issue in our region. Years ago, local suppliers carried a lot of Radeon brands like Sapphire and PowerColor, aside from multi-brand vendors like Gigabyte or MSI, so prices were closer to MSRP; but these days I don't think local suppliers get them, so they're much more expensive.
In a regular shop here, you will see this, dominated by Nvidia:
[screenshot of local shop GPU listings]


7800 XT prices (756 to 880 USD, converted). I can imagine these will still go down, but considering Radeon GPUs are low in supply here, I'm not expecting much:
[screenshot of 7800 XT price listings]
 
Exactly. At launch the 4090 was priced exactly as expected (i.e. far too pricey, but in line with US prices), while the 7900 XTX was at least $200 extra. That sort of thing doesn't generate big sales.
 
It's a mixed-bag generation, that's for sure.

Honestly at a technical level I think it's bloody excellent, for what physical specs each card packs, they're delivering a lot of perf and a massive uplift vs 30 series spec equivalents.

The issue is that most of the products have the wrong name for their performance and specs, and then the wrong price too, although the names wouldn't be such a bad thing if the prices were right, imo.

So yeah, mixed: the cards themselves are highly impressive; there's just none I want to buy for what they offer for the money. Same for RDNA3, really.

Also, of course, it's funny to see DLSS come up again and be massively discredited... by the folks who don't use it. :roll: Why do you think, when I created a DLSS IQ thread, I limited it to owners, and further to owners currently using it with their main card day to day? Because otherwise we just keep hearing how it sucks and won't stick around from people who are actively invested in that future coming true, and who can't or don't benefit from it in the here and now.

I've heard all the arguments against it multiple times, and in saying this I'm virtually guaranteed to hear some repeated yet again (as they already have been earlier) by the same demographic I mentioned. Yet still, I and many others like it, prefer it, and want it in games. I don't think it's going anywhere in the PC space anytime soon; it's clearly still on the upswing, and clearly enough of a selling point to sway potential buyers facing on-the-fence choices.

honestly the most transformative upgrade can be a monitor especially something with proper HDR Oled/Mini led
QFT. The LG 42 C2 is the single biggest upgrade I've done in the last 10 years; even plugging in 1080p consoles, for example, the monitor itself makes everything so much more vibrant and clear in motion.
 
I said this in your thread as well: People who use something on a daily basis probably do so because they like that thing, don't they? And if you try something and you don't like it, you won't use it again, right? If you're looking for diverse, honest opinions, you shouldn't limit the target of your question to people who use and like the thing in question. Otherwise, you may as well start a poll with only one possible answer.

"Do you like broccoli? Please only answer this question if you eat broccoli every day." See what I mean? ;)
 
Well for starters, I disagree, and I find your example way off-base. Broccoli isn't changing, there aren't significant upgrades between versions, this analogy has fallen totally flat for me.

DLSS is precisely the kind of thing you could and should return to (if you can), as it has improved and evolved significantly since its inception, and quite a bit even since you last used it on RTX hardware yourself, especially at lower input resolutions (i.e. I vehemently disagree that it's unusable/trash at 1080p; I use it a lot at 1080p on my A2000, and find that assertion to be the opposite of my experience). Plus, results change per game and per DLL. The mod in the thread agreed at the time and had good points:

I'd kindly ask that people who aren't currently using up-to-date drivers on RTX cards capable of DLSS shouldn't be posting. Tech moves along and we can all agree that the first implementations of DLSS weren't great. But it has improved so arguing over older experiences is tantamount to trolling. I'll also add that DLSS is game dependent, which makes it harder to argue for as a global thing.

If someone has an RTX card that's in active use in their rig, and they try it periodically and want to share that they hate it and find it a useless gimmick, I'm all ears, even if I disagree. I just happen to have some criteria by which someone's opinion on a given matter might carry more or less weight/merit, so this is where I stand for tech like DLSS, and I'm yet to hear an argument compelling enough to make me reconsider, especially when the results of grouping the different opinions are fairly evident and reinforce my rationale. This isn't to say that people who don't use it can't have an opinion; the opinions shared are totally valid in your own contexts, and some even have good merit to them, and nobody can take that away from you. But I can absolutely weigh them how I see fit.

I didn't mean to derail the thread with this; it's somewhat relevant to Ada (a possible selling point for GPU buyers), but I don't expect you or certain others to change your minds, just as I likely won't be swayed by people on forums who don't use it telling me how much of a gimmick/useless/not-a-standard it is.
 
1080p native vs. 4K DLSS Balanced: roughly the same internal resolution, similar framerate.

1080p native:
[screenshot]

4K DLSS Balanced:
[screenshot]
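For context on why those two render loads end up so close, here's a quick sketch that computes DLSS internal render resolutions from the commonly published per-axis scale factors. The exact factors are an assumption here; games can use dynamic or custom scaling.

```python
# Commonly published per-axis DLSS scale factors (assumed; actual values
# can vary per game and per DLSS version).
DLSS_SCALE = {
    "Quality": 2 / 3,          # ~0.667
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the (rounded) internal render resolution for a given output size."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# 4K output: Performance mode renders internally at exactly 1080p,
# and Balanced is only slightly higher.
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Balanced"))     # (2227, 1253)
```

So a 4K DLSS Balanced frame is rendered at roughly 2227x1253 internally, a bit above native 1080p's pixel count, which is why the framerates in the comparison above land close together.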
 
It all depends on the perspective.

For me, what you wrote makes no sense: I feel bad running silly MS software which I cannot customize, and I have to install more silly software to make it usable.

Wait a sec, are we talking about the desktop experience? I don't even know what I would need to do to customize my Windows desktop the way I like it, and I'm running basic KDE with a minimal setup. I couldn't even move my taskbar in Windows.

I feel the hardware is better optimized, and you can see this by looking at tests or simply running Geekbench on Linux. AMD and Intel pay attention because, as you noticed, all the servers and professional environments run on Linux, so the hardware support is amazing, except for Nvidia; Nvidia is absolutely terrible. They released an open-source kernel module and left the driver for the community to build? I'm gonna get rid of both this card and this monitor.
Yeah, Nvidia is absolutely terrible for Linux, yet it owns the AI market with servers that run on Linux, using frameworks like CUDA and TensorFlow to train the models. Let's get real now.

If your Nvidia experience on Linux is bad, use a proper distro or set it up right.

Geekbench is horrible for comparisons across different operating systems; nothing new here. A few workloads might perform better on Linux here and there, yes, but if you look at overall performance, especially gaming, performance is way worse on Linux. Some games are a hit, but 99% are a miss, with zero optimization and focus from devs.

You've spent about 1.5 pages now talking about numerous upscaling techs while the topic is not entirely that. We get it, you love these techs, good on you. Others don't.
"Others" being AMD users :laugh: No wonder they don't like DLSS/DLAA/DLDSR or Reflex when they're stuck with inferior features.
You AMD users can keep denying facts all you want; it doesn't change reality. AMD is nowhere near Nvidia in terms of features, hence the lower price. Logic 101.
 
Nobody is telling you that DLSS is a gimmick. I only said that I consider it that, as scrapping tensor cores, then using the freed-up die real estate for more raster and/or RT hardware could result in better native resolution performance and less need for upscaling to begin with. I also said that I didn't like it much when I played Cyberpunk 2077 with DLSS Q at 1080p. You are absolutely free to differ from this opinion and share your own experiences - that's what an online forum is for after all. But let me not digress from the main topic any more. :)

These screenshots look just about the same to me. But then, the questions that follow are: 1. how different is your experience in-game, and 2. do you really want a 1080p-like experience on a 4K screen?
 