
So what does the Ada release by Nvidia mean to you?

Has the Ada release by Nvidia been good, in your opinion?


  • Total voters: 101
Status
Not open for further replies.
Those are Nvidia's numbers. They are claiming DLDSR 2.25x offers the same quality as DSR 4x. I doubt that's true though; sounds pretty wild. You could use 1.5x DSR with 40% more raster performance. But I'm with you, DLSS and DLAA are great.
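For context on what those factors mean in pixels: DSR/DLDSR factors apply to the total pixel count, not to each axis, so each axis scales by the square root of the factor. A quick back-of-the-envelope sketch (the helper name is my own, and the driver may round the final resolution slightly differently):

```python
import math

def dsr_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    """DSR/DLDSR factors scale total pixel count, so each axis scales by sqrt(factor).
    Rounding here is illustrative only."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

base_w, base_h = 2560, 1440  # 1440p monitor
for name, factor in [("DSR 1.50x", 1.5), ("DLDSR 2.25x", 2.25), ("DSR 4.00x", 4.0)]:
    w, h = dsr_resolution(base_w, base_h, factor)
    print(f"{name}: renders {w}x{h} ({w * h / 1e6:.1f} MP vs {base_w * base_h / 1e6:.1f} MP native)")
```

At 1440p, 2.25x renders a 4K image (about 8.3 MP) while 1.5x renders roughly 3135x1764 (about 5.5 MP), so 1.5x shades about two-thirds as many pixels, which is roughly where the raster-performance headroom mentioned above comes from.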

As far as I can tell it's similar; I've extensively used both on Witcher 3 prior to the next-gen patch.
 
Did I not just say that I also have a 2070 sitting on a shelf that I was actively using for about 2 years, including for Cyberpunk 2077? Why am I bothering with this conversation if you can't be bothered to read my replies?
The 2070 is like 6 years old, and you played Cyberpunk 2077, one of the worst-optimized and most demanding games ever, released in very late December 2020, with a mid-range card from 2018? This is what you base your experience with Nvidia features on? Come on, the 2070 did not even beat the 1080 Ti in most games. Most features I speak about were not even present back then... DLSS 1 sucked just as much as FSR 1, DLSS 2 and onward is where the magic happens, and DLDSR came out in early 2022.

So what did you use before your 7800XT? Did you buy the 2070 in 2021? :laugh: You play in 1080p, eh? DLDSR is magic for 1080p and 1440p, but 1080p especially; 1080p looks very bad without downsampling.
 
Nothing wrong with a 2070 Super; I just got a lot of free bonus fps when I upgraded from an i7 6700K to an i7 12700K, and I'm still using my 2070 Super with a 1440p 165Hz monitor.
It's still good enough for the games I play. I'll probably wait for GTA 6 and then look for a new GPU, maybe an RTX 5xxx.
 
Nothing wrong with a 4060 Ti either, if it suits your needs... Not even a 6800XT or 3080 would be enough for me these days, and they are both night and day faster than the 2070.

However, the 2070 Super is 12-15% faster than the 2070, which was the card he mentioned.

If you don't have any issues at 1440p/165Hz with a 2070, which games are you actually playing? Because some games will absolutely punish a 2070 if you actually take full advantage of that monitor (meaning a 120-165 fps goal), unless you play at low settings and go 100% CPU-bound.

Also, your CPU was outdated, so no wonder you got a big performance boost; 6C/12T has been the minimum for years, at least in most newer demanding AAA games.

Crysis 3 from 2013 scaled way above 4 cores and ran best with 6+ cores; there was a huge performance drop using only a quad-core chip, and that was 10 years ago.
 
The 2070 is like 6 years old, and you played Cyberpunk 2077, one of the worst-optimized and most demanding games ever, released in very late December 2020, with a mid-range card from 2018? This is what you base your experience with Nvidia features on? Come on, the 2070 did not even beat the 1080 Ti in most games. Most features I speak about were not even present back then... DLSS 1 sucked just as much as FSR 1, DLSS 2 and onward is where the magic happens, and DLDSR came out in early 2022.

So what did you use before your 7800XT? Did you buy the 2070 in 2021? :laugh: You play in 1080p, eh? DLDSR is magic for 1080p and 1440p, but 1080p especially; 1080p looks very bad without downsampling.
I played CP with the DLSS 2 update, so yeah... DLSS is crap at 1080p. It gives you some extra frames when you really need them... like when you're playing on a 2070, for example. ;) Upscaling is an aid to get performance that you don't have. Nothing more. That's my opinion and it's final. I've had this argument a million times over and I don't intend to have it again.

Nothing wrong with a 4060 Ti either, if it suits your needs ...
Just the price. And that's why I think the entire 40-series sucks.
 
which games are you actually playing?

Mostly the GTA series, Forza Horizon 4/5, Monster Jam, Wreckfest, the Dirt series, the GRID series and such.
 
Got a 3090, so the 40 series is definitely a skip for me...
 
What would happen in an alternate timeline, where Raja went to work for Nvidia instead of AMD and Intel?
We might get another Vega :D
 
For the "meh" brigade, I mentally count that as a -1: the release clearly isn't good for you, you just didn't want to call out Nvidia, or truly didn't care, so no answer fit.
That's up to future card releases to decide, though, not you at the moment?
Also, calling out NV for being... a capitalist company (i.e. making more and more money for investors) is meant to accomplish what, exactly?
If they think selling cards at a higher price is better for the bottom line, they will do it (or at least I expect them to).
Also also, as a GPU "customer" (i.e. a person who wants to buy a GPU), you always have a choice:
Do you buy it, or not?
If you do, that confirms the price is right (for you specifically).
If you don't... (NV: well, there is always someone who thinks otherwise)
Sure, if enough customers think the price is too high, demand will drop.
But will NV care enough about it? (I'd say, if margins and profits are high enough on other stuff, chances are they won't.)

PS. Some people have other priorities than buying the latest GPUs though, so don't assume we all care about current prices/performance/features. Some may care about used stuff and have ample patience to wait at least a few years before getting a new card (since the current one is still "good enough", like me :P and yes, I bought it used).
 
Oh, I'm no judge, but I am allowed an opinion, and I held it back for a while specifically so as not to cloud others or steer the conversation. You all do you, as I often say.

But also, IMHO, only Intel is selling reasonably priced GPUs.

I will begrudgingly pay up sometimes as a tech enthusiast, and I do want to game, but on a 3-5 year upgrade plan your experience and choices are always going to be different from those of a yearly or two-year upgrader.

I personally only like to buy into a doubling of performance, minimum, hence the long upgrade gaps.
So for me personally, Nvidia sold too little VRAM per segment at too high a price, except at the top end obviously. That's my opinion; you're free to think differently.

But YMMV and that's fine.
 
Is it subjective to claim that 4k is superior to 320p in image quality? Do I need to use a tool to objectively measure that?

I can prove that, in technical terms, yes, it can be subjective to claim 4K is superior to 320p, especially when you take something that was recorded at 320p, stretch it to 4K, and then claim its image quality is better when in fact it is not.
The point is, there is a factual way to compare data, and there is an opinion-and-feelings way to compare something.
90% of reviewers have only done the latter, opinion-based, for all these upscalers, by merely claiming the visuals are superior to native or to other upscaling tech.
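On the "factual way to compare" point: objective full-reference metrics do exist (PSNR, SSIM and the like) that put a number on how far a processed image strays from a reference. A minimal PSNR sketch, assuming NumPy is available; the images here are synthetic toys, purely to show the mechanics:

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: a "native" image vs. a crude nearest-neighbour stretch of a
# downsampled copy (simulating low-res footage blown up to a higher resolution).
rng = np.random.default_rng(0)
native = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
low_res = native[::4, ::4]                             # 16x16 "low-res capture"
stretched = np.repeat(np.repeat(low_res, 4, 0), 4, 1)  # stretched back to 64x64

print(psnr(native, native))     # inf (identical)
print(psnr(native, stretched))  # a finite dB value: the loss is measurable
```

This is exactly the kind of comparison that can be reported as a number instead of as an impression, which is the distinction the post above is drawing.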
 
Are you saying 4k isn't objectively better than 320p? Ok
 

I think what he is trying to say is that people say 4K upscaled from 1440p via DLSS can be better than native, but it is not.
 
5 years ago at Nvidia:
"Look, boss, the new Tensor cores are awesome for machine learning."
"Oh, really!"
"And now they're part of our new architecture all across the board!"
"Cool! But how do we sell this to gamers?"
"Oops... err... lemme think... dunno... don't worry, we'll come up with something."

Thus, DLSS was born.
Sadly I think you are spot on.

Don't get me wrong, I hate it too but we can't outbid big data. We're stuck.
 

Well, when it's been successful to the point that both Intel and AMD are trying to compete with it, and people are willing to pay higher prices for their GPUs partly because of it, it probably was a good bet.

I do hope a universal upscaler that is equal to or better than DLSS gets developed, but until then I don't see this changing.

Even Frame Generation has gone down way better than I expected, and it's a feature I don't really care for.
 
It totally was a good bet, but not for that reason: simply because they get to rebrand what would otherwise be a useless datacenter feature as a gaming one and suffer (almost) no bad PR for it.

That it works halfway decently is a bonus for everyone involved, naturally.
 
Tensor cores are more useful for watching those old 480p videos as far as I'm concerned; DLSS can go extinct for all I care. Pretty sure they will have more impact on video/streaming, and there's arguably a much bigger market there!
 
Exactly this. If Nvidia wants to sell you a shiny turd, 9 out of 10 people will buy it, and AMD and Intel will have to follow suit to stay competitive.
 
Tensor cores are more useful for watching those old 480p videos as far as I'm concerned; DLSS can go extinct for all I care. Pretty sure they will have more impact on video/streaming, and there's arguably a much bigger market there!

The one major difference is that when using it with games, in my use case power consumption often goes down, whereas when I use it for streaming, power consumption goes up 2-3x, so for me it's not very useful. I haven't tried it with streaming in a while to see if that's changed, though.

Exactly this. If Nvidia wants to sell you a shiny turd, 9 out of 10 people will buy it, and AMD and Intel will have to follow suit to stay competitive.

That's the reality we live in, though: Nvidia's $h!+ is worth more than AMD's $h!+.

Bottom line though, both these companies are just going to try to sell their turds for as much as people are willing to pay. Nvidia has a massive advantage from the professional market though, so if gamers don't want their cards, someone else will.

DF had a funny interaction with someone from AMD: I guess they were asked what they thought the 7800XT should cost, they replied $499, and the AMD rep sounded insulted by them saying that. Comically, it ended up at that price. Too bad they didn't ask what the 7700XT should cost.
 
Did you mean the 7700XT? Such a massive opportunity lost to bring the famed 7970 x/xt/xtx/x3d/hx3d back, and so many more combinations :shadedshu:

Funnily enough, so many gamers are probably superstitious about odd(?) numbers, what the heck :slap:
 

Yeah, I kinda think it was a missed opportunity to call them the 7850 and 7870....
 
Initially I was pretty meh on Ada. Seeing how each core was massively cut down from the 4090, the memory bandwidth, all of that made me look at AMD. Like for realz.

But thinking back to how people were like "8GB is not enough!" and remembering my experience with my 3070 Ti.. I thought it was a good card. Not the best of course.

But running my 4070 Ti and watching people say how shitty these are just makes me chuckle. "12GB is not enough!" Lol.. Ok.. armchair reviews don't count :)

If you want the best, buy the best.. that's what it's there for :)
 
For me it's always been about what a given GPU should offer in the VRAM department for its price, but that really comes down to the buyer and what type of games they play. I definitely wouldn't buy a card I thought was inferior just because it has more VRAM.

Although it is meant for 1080p, I did buy a 12GB 4070 over a 16GB 7800XT for my brother, and a 12GB 6700XT for a just-for-fun AMD build because it's cheap. So in general I have no issues with 12GB.
 
Yeah maybe I could have worded that a bit differently.. next time :D
 

I might sound harsh towards a specific GPU because my use case doesn't align with it, or because I think it is overpriced for what it offers, but I would never question someone else's purchase, even if that was grabbing a 4060 Ti 16GB over a 7800XT 16GB. I might think it's questionable, but for their use case it might be the right choice; same with people buying a 7900XT over a 4070 Ti.

I might jokingly give them a hard time though which on the internet is usually interpreted as hostile but it is what it is. :laugh:
 