
So what does the Ada release by Nvidia mean to you?

Has the Ada release by Nvidia been good, in your opinion?


  • Total voters
    101
Lots of flag waving, not so much conversation.
 
And that's a problem because...?
It's a problem because instead of performance bumps, we get more fake resolutions and fake frames with generational upgrades. You're having to degrade your image quality more and more, unless you own a flagship card. DLSS is being used as a lame excuse to sell overpriced, underperforming GPUs, and for game devs to skip optimisations. That's the problem.

No, it is not. Reviewers that tested them say otherwise. Let's stop with the misinformation campaign.

Give me more isolated, cherry-picked examples, please. :D We all know that frame queue limiting is hit-and-miss depending on the game.

And paying 549€ anno 2023 for a card with the RT performance of a 3-year-old 500€ 3070 is an absolute waste in my opinion, but as per usual, YMMV. You know, not everyone has the same needs and wants.
Yep, you do you, we can agree on that. In my opinion, RT is still just a gimmick. Maybe in 5-10 years, we can revisit this conversation, and I might be of a different opinion then.

Why? 16GB made no difference, not even in 4K gaming -> https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/31.html
Shows just how irrelevant VRAM is when the GPU is the limiting factor.

A slow GPU won't need more than 8GB, even in 2023. The 4060 series are considered slow GPUs, meant mostly for 1080p gaming. They can do 1440p in some games, though, on lower presets. Not meant to max out demanding games at 1440p at all, meaning 8GB is plenty.

The 3070 8GB very easily beats the 6700XT 12GB in 4K gaming, FYI -> https://www.techpowerup.com/review/asus-radeon-rx-7700-xt-tuf/35.html
So where is the magic of more VRAM here? A slow GPU will still be slow, and none of those cards are suited for 4K on a high preset anyway.

You ramble about VRAM; meanwhile, most look at performance numbers. The only games that caused 8GB cards problems were a few rushed console ports like TLOU and RE4, and they ran fine after a patch or two, even on 8GB.
Like I said, in current games. If you're planning to keep the card for 3-5 years, however, you might be screwed. The funny thing is, you won't know until the next blockbuster comes out and runs like crap.
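For some rough context on where VRAM actually goes (back-of-the-envelope arithmetic with assumed buffer counts, not numbers from any review): the resolution-scaled buffers are tiny compared to texture pools, which are set by the quality preset. That's exactly why a future blockbuster's texture quality can choke an 8GB card even at 1080p.

```cpp
// Back-of-the-envelope sketch (assumed buffer counts, illustration only):
// rough render-target footprint per resolution for a deferred renderer with
// ~5 full-screen targets plus a double-buffered swapchain, 4 bytes per pixel.
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; };
    const Res resolutions[] = {
        {"1080p", 1920, 1080},
        {"1440p", 2560, 1440},
        {"4K",    3840, 2160},
    };
    const double bytesPerPixel = 4.0;  // assumed RGBA8-class targets
    const int fullScreenTargets = 7;   // assumed: 5 G-buffer/HDR targets + 2 swapchain

    for (const Res& r : resolutions) {
        const double pixels = static_cast<double>(r.w) * r.h;
        const double mib = pixels * bytesPerPixel * fullScreenTargets / (1024.0 * 1024.0);
        std::printf("%-6s -> ~%4.0f MiB of render targets\n", r.name, mib);
    }
    return 0;
}
```

Even at 4K that's a couple of hundred MiB; the remaining gigabytes are mostly textures and geometry, which scale with the preset, not the resolution.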
 
Ada doesn't mean much to me. My graphics card is 3 generations behind, and the rest of my system even more so... all my money goes to other places now and I've still not run into crippling performance issues doing what I do with my rig. /shrug
 
It's a problem because instead of performance bumps, we get more fake resolutions and fake frames with generational upgrades. You're having to degrade your image quality more and more, unless you own a flagship card. DLSS is being used as a lame excuse to sell overpriced, underperforming GPUs, and for game devs to skip optimisations. That's the problem.
That's your opinion, which is irrelevant first and foremost because you don't have an Nvidia card to test your claims. Most if not every review disagrees with you: DLSS not only improves your framerate but also improves image quality. Period.

Give me more isolated, cherry-picked examples, please. :D We all know that frame queue limiting is hit-and-miss depending on the game.
Ok, give me a link to your non-cherry-picked tests that show that Anti-Lag = Reflex. Nobody has ever claimed anything like that. Reflex is NOT a frame queue limiter.

At this point I think you are just lying. You can't be serious...
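To spell out what I mean by a frame queue limiter, here's a minimal D3D11 sketch (my own illustration with the stock DXGI call, not code from any test): it just caps how many frames the CPU may run ahead of the GPU. Reflex goes further by having the driver pace when the CPU even starts simulating each frame, which is why the two aren't interchangeable.

```cpp
// Minimal sketch of classic frame queue limiting in D3D11 (illustration only;
// assumes you already have a created ID3D11Device).
#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CapFrameQueue(ID3D11Device* device) {
    ComPtr<IDXGIDevice1> dxgiDevice;
    if (SUCCEEDED(device->QueryInterface(IID_PPV_ARGS(&dxgiDevice)))) {
        // Allow at most one frame in flight; the default is typically 3.
        // This trades a little throughput for latency, with per-game results.
        dxgiDevice->SetMaximumFrameLatency(1);
    }
}
```

Reflex, by contrast, is an in-game SDK (the NVAPI sleep-mode/marker calls) that delays the start of CPU simulation just long enough that the render queue stays near empty.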
 
I have no idea what "here in the EU" means, but here in the UK, the cheapest 4070 is 90 quid more expensive than the cheapest 7800 XT. That's 18%! I don't care what gimmicky features Nvidia is trying to push down my throat, I'm not paying that much more for identical performance. I wouldn't even compare to the 4060 Ti, as it is an entirely different performance segment altogether, yet the 16 GB version costs as much as the 7800 XT does.

Like I said above, the only card I'd consider buying over its direct competitor is the 4060, because that's the performance level that needs DLSS and FG the most.
Gimmick features, like best-in-class AA and upscaling, plus superior DLDSR, which AMD has no answer for? Haha :laugh: AMD is cheaper for a reason. Why do you think? No Reflex, no DLSS, no DLAA, no DLDSR, no Shadowplay, just a pure rasterization performance focus with inferior features and slow RT perf. This is why AMD is priced lower than Nvidia. You get what you pay for. FSR is a joke compared to DLSS/DLAA. This is probably why AMD users speak about native all the time: FSR is their only experience with upscaling, and it is mediocre at best compared to DLSS. And games with DLSS have the option for DLAA as well, the best AA method today. AMD users are stuck with TAA, which sucks in most games.

The UK got fcked back when it left the EU; prices went south and you went poor. Look at caseking.de and you will see that the 7800XT is priced within 50 euro of the 4070. Same in Denmark, Norway, Sweden, Finland and pretty much every country in ACTUAL Europe. No-one here cares about UK pricing these days. Stuff is way too expensive because of taxes. You can thank Boris, if he's awake.

Nothing about the 7800XT is impressive, and the GPU is too weak to even utilize 16GB anyway. It performs just like the 6800XT, which you could have bought for the same price, and even cheaper many times.

Like I said, in current games. If you're planning to keep the card for 3-5 years, however, you might be screwed. The funny thing is, you won't know until the next blockbuster comes out and runs like crap.
In 5 years you will be screwed regardless, if you plan to max out games. That is the funny thing. VRAM never saved a weak GPU, and the 7800XT is mid-range at best. Also, AMD GPU owners won't have DLSS to save them if they ACTUALLY plan to keep their GPU for 4-5-6 years.

The 7800XT doesn't even max out games today in 4K; in 5 years it will be slow as dogpoo. It will probably barely play games on medium at 1440p. And you will have no DLSS to help you and will have to use the inferior FSR.

Seriously, don't try to future-proof when it comes to PC hardware. Seen this too many times. Any GPU will run like crap in 5-6 years, because AMD and Nvidia put their full focus on the newest generation, and then on the one before it. They don't spend time on GPUs 3-4 generations old, and you will see tons of issues in new games. Zero optimization present, and developers don't care either, because they use somewhat new GPUs.

Yes, more VRAM is great when the actual GPU is great. This is not the case with the 7800XT. It delivers nothing we did not have already. The 6800XT is 3 years old and has the same performance; it was even cheaper than the 7800XT is now. The only reason AMD waited almost half a generation before releasing the 7800XT was because the 6800XT filled the same slot and sold better.
 
That's your opinion, which is irrelevant first and foremost because you don't have an Nvidia card to test your claims. Most if not every review disagrees with you: DLSS not only improves your framerate but also improves image quality. Period.
As a matter of fact, I do have a 2070 that I played Cyberpunk 2077 on. 1080p native looked the best, DLSS Quality was borderline acceptable (I finished the game with this setting), and Balanced and below were utter trash. So I do know what I'm talking about, believe it or not.

Ok, give me a link to your non-cherry-picked tests that show that Anti-Lag = Reflex. Nobody has ever claimed anything like that. Reflex is NOT a frame queue limiter.

At this point I think you are just lying. You can't be serious...
Yeah, I'm lying. Just like I don't have an Nvidia card, I don't have an AMD one, either, so I have no clue what Anti-Lag actually is. Come on, man... :roll:

I don't need any review to tell me about quality settings that cannot be conveyed through a YouTube video by any means. That's what I've got my own games for.

Gimmick features, like best-in-class AA and upscaling, plus superior DLDSR, which AMD has no answer for? Haha :laugh: AMD is cheaper for a reason. Why do you think? No Reflex, no DLSS, no DLAA, no DLDSR, no Shadowplay, just a pure rasterization performance focus with inferior features and slow RT perf. This is why AMD is priced lower than Nvidia. You get what you pay for. FSR is a joke compared to DLSS/DLAA.
Reflex = sorta Anti-Lag, DLSS = FSR, Shadowplay = ReLive, DLDSR = sorta VSR. I'll give you DLAA, but I don't really know any game that supports it (or is it a driver feature?)

Your average AMD card is only lacking "features" if you're addicted to throwing your favourite Nvidia buzzwords around.

The UK got fcked back when it left the EU; prices went south and you went poor. Look at caseking.de and you will see that the 7800XT is priced within 50 euro of the 4070. Same in Denmark, Norway, Sweden, Finland and pretty much every country in ACTUAL Europe. No-one here cares about UK pricing these days.
We got fucked with Nvidia prices, but not AMD? How does that work? :kookoo:

Nothing about the 7800XT is impressive, and the GPU is too weak to even utilize 16GB anyway. It performs just like the 6800XT, which you could have bought for the same price, and even cheaper many times.
It's not cheaper here.

In 5 years you will be screwed regardless, if you plan to max out games. That is the funny thing. VRAM never saved a weak GPU.
We'll see about that. If I wasn't as curious about new tech as I am, I'd probably be gaming on my 2070 still. Besides, does 2/4 GB 960 ring a bell? Or 3/6 GB 1060? People said the same back then.
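And on the 1080p DLSS point: quick math with the commonly published per-axis DLSS scale factors (per-game values can vary) shows what each mode actually rasterizes before upscaling, which is exactly why Balanced and below fall apart at 1080p output.

```cpp
// Internal render resolution per DLSS preset at 1080p output, using the
// commonly published per-axis scale factors (illustration; games can differ).
#include <cmath>
#include <cstdio>

int main() {
    struct Mode { const char* name; double scale; };  // per-axis render scale
    const Mode modes[] = {
        {"Quality",     2.0 / 3.0},  // ~66.7%
        {"Balanced",    0.58},
        {"Performance", 0.50},
    };
    const int outW = 1920, outH = 1080;

    for (const Mode& m : modes) {
        const int w = static_cast<int>(std::round(outW * m.scale));
        const int h = static_cast<int>(std::round(outH * m.scale));
        std::printf("%-11s renders %4dx%4d, upscaled to %dx%d\n",
                    m.name, w, h, outW, outH);
    }
    return 0;
}
```

Balanced at 1080p is reconstructing from roughly 1114x626; there's only so much any upscaler can do with that.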
 
As a matter of fact, I do have a 2070 that I played Cyberpunk 2077 on. 1080p native looked the best, DLSS Quality was borderline acceptable (I finished the game with this setting), and Balanced and below were utter trash. So I do know what I'm talking about, believe it or not.


Yeah, I'm lying. Just like I don't have an Nvidia card, I don't have an AMD one, either, so I have no clue what Anti-Lag actually is. Come on, man... :roll:

I don't need any review to tell me about quality settings that cannot be conveyed through a YouTube video by any means. That's what I've got my own games for.


Reflex = sorta Anti-Lag, DLSS = FSR, Shadowplay = ReLive, DLDSR = sorta VSR. I'll give you DLAA, but I don't really know any game that supports it (or is it a driver feature?)

Your average AMD card is only lacking "features" if you're addicted to throwing your favourite Nvidia buzzwords around.


We got fucked with Nvidia prices, but not AMD? How does that work? :kookoo:


It's not cheaper here.


We'll see about that. If I wasn't as curious about new tech as I am, I'd probably be gaming on my 2070 still.
I asked you for a test showing that Reflex = Anti-Lag. You haven't provided any; the only thing you've provided is your opinion. Testing shows that they are not the same thing. Actual testing shows that FG + Reflex has the same latency as Anti-Lag without FG. That's all that needs to be said on the matter.

If you don't care about input latency, sure, whatever, it's fine, I've got no issue with that, but let's not pretend these two are the same, because they just aren't.
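To illustrate why those results are plausible, here's a toy latency model (round, made-up numbers purely for illustration, not measurements from anyone's testing): a driver-level limiter like Anti-Lag caps the pre-rendered queue at about one frame, Reflex paces the CPU so the queue stays near zero, and frame generation holds back the newest real frame to interpolate, costing roughly a frame.

```cpp
// Toy input-latency model (illustrative numbers only, NOT measurements).
// latency ~ (frames queued ahead + 1 + FG penalty) * frame time
#include <cstdio>

int main() {
    const double frameTimeMs = 16.7;  // assumed ~60 fps render rate

    struct Config { const char* name; double queuedFrames; double fgPenaltyFrames; };
    const Config configs[] = {
        {"No limiter",          2.0, 0.0},  // driver queue roughly full
        {"Anti-Lag (queue ~1)", 1.0, 0.0},  // driver-level cap
        {"Reflex (queue ~0)",   0.0, 0.0},  // sim start paced by the driver
        {"FG + Reflex",         0.0, 1.0},  // FG withholds a frame to interpolate
    };

    for (const Config& c : configs) {
        const double ms = (c.queuedFrames + 1.0 + c.fgPenaltyFrames) * frameTimeMs;
        std::printf("%-20s ~%5.1f ms\n", c.name, ms);
    }
    return 0;
}
```

Under that toy model, FG + Reflex and plain Anti-Lag both land around two frame times, which is the pattern the testing I'm referring to shows.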
 
What would happen in an alternate timeline where Raja went to work for Nvidia instead of AMD and Intel?
 
I asked you for a test showing that Reflex = Anti-Lag. You haven't provided any; the only thing you've provided is your opinion. Testing shows that they are not the same thing. Actual testing shows that FG + Reflex has the same latency as Anti-Lag without FG. That's all that needs to be said on the matter.

If you don't care about input latency, sure, whatever, it's fine, I've got no issue with that, but let's not pretend these two are the same, because they just aren't.
Fine (Google is your friend).
 
Sure, but what about scrapping the Tensor cores and using the now freed-up die real estate for more traditional shaders and/or RT cores? Then we wouldn't need DLSS in the first place, and could have decent enough performance at native resolutions.

DLSS is like buying a car with square wheels, then redesigning the world's road infrastructure to fit cars with square wheels - a solution to a problem that shouldn't even exist.

I wouldn't want that, and I think most people actually buying Nvidia hardware wouldn't either. I like DLSS, and again, the image quality it offers at 4K I often like better than native; the 30-50% performance bump is just a bonus.

I mean, that's literally the whole reason I purchased a 4070 over a 7800XT despite the 100 gulf in price. If they removed the tensor cores and added rasterization performance/RT cores, it would just be an AMD card with slightly better RT. No thanks.
 
Fine (Google is your friend).
That's not a comparison? He is just testing Reflex...

I wouldn't want that, and I think most people actually buying Nvidia hardware wouldn't either. I like DLSS, and again, the image quality it offers at 4K I often like better than native; the 30-50% performance bump is just a bonus.
I'd trade DLSS for around 30-40% extra raster performance.
 
That's not a comparison? He is just testing Reflex...
There's a brief description of both as latency-reducing technologies. I'm not a developer, and I'm too tired after work to deal with the details. The rest can be googled.
 
As a matter of fact, I do have a 2070 that I played Cyberpunk 2077 on. 1080p native looked the best, DLSS Quality was borderline acceptable (I finished the game with this setting), and Balanced and below were utter trash. So I do know what I'm talking about, believe it or not.


Yeah, I'm lying. Just like I don't have an Nvidia card, I don't have an AMD one, either, so I have no clue what Anti-Lag actually is. Come on, man... :roll:

I don't need any review to tell me about quality settings that cannot be conveyed through a YouTube video by any means. That's what I've got my own games for.


Reflex = sorta Anti-Lag, DLSS = FSR, Shadowplay = ReLive, DLDSR = sorta VSR. I'll give you DLAA, but I don't really know any game that supports it (or is it a driver feature?)

Your average AMD card is only lacking "features" if you're addicted to throwing your favourite Nvidia buzzwords around.


We got fucked with Nvidia prices, but not AMD? How does that work? :kookoo:


It's not cheaper here.


We'll see about that. If I wasn't as curious about new tech as I am, I'd probably be gaming on my 2070 still. Besides, does 2/4 GB 960 ring a bell? Or 3/6 GB 1060? People said the same back then.

Dude, ReLive is a joke compared to ShadowPlay. This is why 99.9% of streamers use Nvidia. ShadowPlay has native support on Twitch and all the big streaming sites, meaning pretty much zero performance loss when streaming, with better quality than ReLive can manage.

VSR is not DLDSR. VSR is DSR and no RTX user uses DSR anymore. DLDSR is superior and replaced it.

Reflex is superior to what AMD has; it has been tested numerous times.

Every single feature you mention, Nvidia came up with first. AMD then copied it, and the result was worse in every case. AMD doesn't have the R&D funds to spend on stuff like this. They spend most of those funds on CPU and APU design. Their GPU sector is barely profitable.

AMD does not have a single feature that is on par with or better than what Nvidia offers. FSR3 is going to be inferior to DLSS3 as well, and DLSS 3.5 came out recently. AMD is always a few years behind on features, really. Sure, if you want raster performance only, like it's 2013, then an AMD GPU will do just fine, in some games (just stick to the popular ones, because AMD GPUs get zero optimization in less popular games and early-access titles).
 
That's not a comparison? He is just testing Reflex...


I'd trade DLSS for around 30-40% extra raster performance.

I would also. I just doubt removing the tensor cores would add that; maybe 10% max, but I'm not a hardware engineer. It's especially an issue on something like a 4060 Ti that really needs more raster vs last generation.

I also like DLDSR and DLAA though, not sure I'd want to give those up.
 
Well, this thread is a complete shit show.

Anyhow, to answer the OP: Ada sucks for the very obvious reason that having to spend $1600+ on ANY GPU under ANY circumstances is utter ignorance.

Nvidia has made it crystal clear they have no interest in anything other than squeezing every last thin dime out of gamers (unless your name is AI) for quite some time. I choose not to feed the pig.
 
Dude, ReLive is a joke compared to ShadowPlay. This is why 99.9% of streamers use Nvidia. ShadowPlay has native support on Twitch and all the big streaming sites, meaning pretty much zero performance loss when streaming, with better quality than ReLive can manage.

VSR is not DLDSR. VSR is DSR and no RTX user uses DSR anymore. DLDSR is highly superior.

Reflex is superior to what AMD has; it has been tested numerous times.

Every single feature you mention, Nvidia came up with first. AMD then copied it, and the result was worse in every case. AMD doesn't have the R&D funds to spend on stuff like this. They spend most of those funds on CPU and APU design. Their GPU sector is barely profitable.
We're still talking about gimmicks in my opinion. Besides, I haven't noticed any difference between Shadowplay and ReLive (tested on a 5700 XT ages ago), or DLSS and FSR. Let me leave it at that.

I would also. I just doubt removing the tensor cores would add that; maybe 10% max, but I'm a hardware engineer. It's especially an issue on something like a 4060 Ti that really needs more raster vs last generation.
4060 (Ti)-level cards have extremely small GPUs, so whether you remove Tensor cores or not, they could do with a size and raw performance bump anyhow.
 
4060 (Ti)-level cards have extremely small GPUs, so whether you remove Tensor cores or not, they could do with a size and raw performance bump anyhow.

I meant to say not a hardware engineer lol.... need to get some coffee :laugh:
 
We're still talking about gimmicks in my opinion. Besides, I haven't noticed any difference between Shadowplay and ReLive (tested on a 5700 XT ages ago), or DLSS and FSR. Let me leave it at that.

Of course you think they are gimmicks, when you can't actually use any of those features. :laugh:

If you actually read up on comparisons, you will see that Nvidia features easily beat AMD features. This is why AMD GPUs are cheaper. AMD tries to copy/paste Nvidia features but falls short every single time. Not enough R&D funds and low focus from AMD. Their main focus is CPUs and APUs.

People vote with their wallets, and Nvidia sits at 80-85% gaming GPU market share. If AMD actually was better, people would be buying a lot more AMD cards. But they don't, and AMD has actually lost GPU market share year on year for the last 5+ years.

Nvidia GPUs are slightly more expensive, yet resale value is also higher, so it makes zero difference in the end. AMD hardware loses value faster than Nvidia and Intel, because AMD always competes on price and lowers it over time, meaning your card will be close to unsellable after a few years.
 
I also like DLDSR and DLAA though, not sure I'd want to give those up.
Exactly, that's what 30-40% more raster performance would give you. You could use a higher resolution and downscale to 4K (just like DLDSR) without a performance hit.
 
Of course you think they are gimmicks, when you can't actually use any of those features. :laugh:
Did I not just say that I also have a 2070 sitting on a shelf that I was actively using for about 2 years, including for Cyberpunk 2077? Why am I bothering with this conversation if you can't be bothered to read my replies?
 
Exactly, that's what 30-40% more raster performance would give you. You could use a higher resolution and downscale to 4K (just like DLDSR) without a performance hit.

I doubt a 40% improvement to raster would allow for DLDSR, which renders an image similar to 4x the resolution.
 
I doubt a 40% improvement to raster would allow for DLDSR, which renders an image similar to 4x the resolution.
Those are Nvidia's numbers. They are claiming DLDSR 2.25x offers the same quality as DSR 4x. I doubt that's true though; sounds pretty wild. You could use 1.5x DSR with 40% more raster performance. But I'm with you, DLSS and DLAA are great.
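The area math makes the gap concrete (simple pixel-count arithmetic, assuming raster cost scales roughly linearly with pixel count):

```cpp
// How big an internal resolution a given pixel budget buys at 4K output
// (simple area arithmetic; assumes cost scales linearly with pixel count).
#include <cmath>
#include <cstdio>

int main() {
    // Pixel-budget multipliers: +40% raster, DLDSR 2.25x, DSR 4x.
    const double factors[] = {1.40, 2.25, 4.00};
    const int outW = 3840, outH = 2160;  // 4K output

    for (const double f : factors) {
        const double axis = std::sqrt(f);  // per-axis scale for f-times the pixels
        std::printf("%.2fx pixels -> %.0fx%.0f internal (%.2fx per axis)\n",
                    f, outW * axis, outH * axis, axis);
    }
    return 0;
}
```

So ~40% more raster buys roughly a 1.4x DSR factor, short of even DLDSR 2.25x before you count its ML filtering.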
 
So I left my opinion out of the OP so that this thread did not become a me-vs-the-world thread with me defending my opinion.
Rather a thread to reflect on the release in general.

My opinion is there are no bad GPUs, only bad prices.

And besides the 4090, which I think is a great piece of tech (just out of my price range), the rest were priced poorly relative to Nvidia's prior releases.

For me this isn't an us-vs-them thread, though I get the cross-references are sometimes on point; it's Ada that should be debated.

For the meh brigade: I mentally count that as a -1, because the release clearly isn't good to you; you just didn't want to call out Nvidia, or truly didn't care, so no answer was enough there.
 