
So what does the Ada release by Nvidia mean to you?

Has the Ada release by Nvidia been good, in your opinion?


  • Total voters
    101
Status
Not open for further replies.
Joined
Mar 10, 2010
Messages
11,880 (2.14/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
So, as stated, but with no preset theoretical statement.

I want your opinions, and keep them real, polite and non-flamey! I'll add mine after page 5.
 
Mostly a letdown. The technology is good, but the pricing has been atrocious and the $400-500 price range has stagnated.

The 4090 in games with heavy RT can be really impressive versus last generation and its competitors, but you shouldn't have to spend $1,600+ to feel decent about a card. I feel like every GPU below it comes with caveats. I do think most who own a 4070/4070 Ti/4080 are more than enjoying their hardware, though.
 
Not much really, didn't buy, waiting for next gen...
 
No, because except for the RTX 4090, which does belong in its corresponding performance/price tier, all the others are junk. They are named a tier above their actual performance and priced two to three tiers up.

So the RTX 4060 Ti is actually an RTX 4050 Ti, and its price should be slashed in half.
 
Eh, not much for me really.
As a budget/mid-range user it's not that appealing to me, so I think I will skip the generation (my 3060 Ti is barely a year old in my system, and it's also enough for me at the moment; I don't play that much nowadays).

The 4070 is the only one I found somewhat interesting because of its power efficiency to performance ratio, and maybe affordable on the second-hand market, but I still don't think it's worth it for me.
 
It showed us that the "one good, one really bad" rule still holds true.
 
Ada is great, but prices were somewhat high at launch, same as for the AMD 7000 cards. I got my 4090 for cheap, so it did not matter to me.
Many current-gen cards have been on sale several times, plus game bundles; a lot of people picked up a GPU below MSRP.

Mid-range cards were not a focus this gen. Why? Because Nvidia and AMD had huge stock after the mining collapse. Why do you think AMD waited a year to release the 7700XT/7800XT?

The 4070 is amazing for small builds, and the 4070 Ti delivers 3090 Ti performance at half the power. It is easy to see how bad Samsung 8nm (more like 10nm) is compared to TSMC 4/5nm. The 3000 series was cheap because it went with Samsung; obviously going with TSMC drives prices up. This is also why AMD's CPU prices exploded with Ryzen 3000 and especially the 5000 series onward, as TSMC is way more expensive than the cheap GloFo process. Nvidia was smart, because Samsung could output tons of chips compared to TSMC at the time. The RTX 3000 series outsold the Radeon 6000 series with ease because of this.

Frame Generation is great in some games, Microsoft Flight Simulator 2020 especially. It works wonders for heavily CPU-bound games. Everything looks way smoother with DLSS3 enabled; a lot smoother actually, even on OLED.
People will always hate what they can't use ;) AMD users hate DLSS/DLAA and every Nvidia-exclusive feature because they can't use them. RTX 3000 owners hate DLSS3 and FG because they can't use them. It is human nature.

My 4090 was a huge upgrade over my 3080 Ti.

A friend of mine picked up a 4070 Ti for 649 dollars including Diablo 4, which he would have bought anyway, meaning he effectively paid 579 dollars for the card. He is really impressed with the performance after tweaking; it was a very big upgrade over his 3060 Ti and consumes almost the same amount of power.

You won't see new high-end GPUs before 2025+

AMD doesn't focus on high-end GPUs (RDNA4 will focus on low to mid range), and Nvidia is not forced to release anything (they have the fastest GPU and sales are good enough) and is focused on AI. Billions upon billions in the AI market = low gaming focus.

This is AMD's big opportunity, and they cancel high-end GPUs for RDNA4, LMAO... I don't think AMD even bothers much with GPUs; they earn more per wafer selling CPUs.

Most people who think the RTX 4000 and Radeon 7000 series are bad are hugging their old cards and trying to deny the fact that they are not high-end anymore :laugh: Decent, maybe, for some, but not high-end.
 
Ada Lovelace GPUs offer very limited VRAM bandwidth and capacity, close-to-insanely high prices, questionable tiering, no nukes to nuke Pluto...

I mean, the RTX 4090 is a beast and worth every penny, but everything below it has far more disadvantages than advantages.

That 12VHPWR thing is also a huge "why", but not because it was invented, no. I appreciate new things. What I do not approve of is making them inferior to the old things:

• Very high wattage GPUs require a connection that is as "straight" as possible, yet we couldn't get ATX 3.0 PSUs for a very long time after the release, making it impossible to run higher-end Adas without daisy-chained adapters, which carries a healthy risk of damaging the whole system through electrical shorts.
• Point one also leads us to the fact that it's just ugly when you can't plug your GPU directly into the PSU without any adapters. "Cable management? Who dat?"
• It's incredibly stupid that nVidia didn't make the connector foolproof. The sensing pins should only signal "OK" when it's plugged in correctly, yet that wasn't a thing.
• Not to mention how nVidia blamed users for this nonsense. The moment you're selling complex, expensive products like GPUs, you must, in a perfect world, make them usable by everyone, not only by those with an engineering degree.
• This doesn't directly make the decision bad, but the RTX 4080 and the GPUs below it could have had lower TDPs considering the new node's efficiency. Losing a couple of hundred MHz doesn't turn the GPU into a slow piece of junk, but another 50 to 80 W of savings would be massive. "The era of single 8-pin *080 GPUs is back with Ada!" is a great slogan, innit?

For me, it just shows how unbothered nVidia is. AMD is so far behind that Jensen feels comfortable making a highly overclocked 4050 cost $400 and calling it a 4060 Ti. Why not? AMD didn't do anything to prevent this.

Quality: 4/10. Could be 8/10, but 12VHPWR wasn't that great.
Pricing: 3/10. No comment needed here.
Longevity: 5/10. Do you really expect ~500 GB/s of VRAM bandwidth to stay decent? Games of tomorrow will need more, even at 1080p. And 4070s are sold as ultimate 1440p gamers...
Power efficiency: 8/10. Could be a straight 10/10, but nVidia decided to clock their GPUs stupidly high, way beyond their sweet spots.
Feature set: 9/10. Multi-monitor is more watt-efficient than on AMD, but it lacks the flexibility AMD provides.

Overall: 5.8/10. Not good, not particularly bad. Just a "what did you expect from them?" kind of generation. I rate RDNA2 better and RDNA3 worse. Ampere? Despite being a direct insult, it was still better than Ada all things considered.
 
My 12VHPWR connector has been working flawlessly from day one and still does, a native Corsair connector that plugged right into my 4-5 year old PSU (Type 4 connector). Zero problems with it, and it looks way cleaner because of fewer cables. No way I would prefer 3x 8-pin, or even 4x 8-pin, over a single 12VHPWR connector. It is only a matter of time before AMD abandons the old solution too.

If you think the 4090 is the only good card in the 4000 series, you are 100% clueless. You use a 6700XT and call every RTX 4000 card bad? LMAO... Come on :laugh:
The 4070 absolutely stomps the 6700XT while using 25 watts less. The 4070 Ti destroys it completely.

Why do you care about VRAM bandwidth when performance is present? Performance is all that matters. The 4070 Ti has a 192-bit bus and easily beats the 3080 Ti with its 384-bit bus. It is pointless to look at raw numbers when performance is what matters to people; they are different architectures. Ada also has a lot more cache than Ampere to make up for the lower bandwidth, plus other improvements, if you actually read about the Ada design.

Not sure what you are talking about with power. Ada is way more power efficient than Ampere, which almost required undervolting (which you can still do with Ada if you want; I did just that with my 4090, getting insane performance per watt while retaining stock performance or even increasing it slightly).

AMD had huge power spikes on the 6800 and 6900 cards. They fixed this with the 7000 series.


Look at the bottom. The 6800XT and 6900XT spike to 579 and 636 watts. This is what makes PSUs pop. In comparison, the 4070 Ti spikes at around 325 watts, the 4080 at 371 watts, and the 7900XTX at 455 watts. Look up the newer reviews...

Do you think Nvidia has much focus on gaming GPUs when they beat AMD in this segment years ago? AMD is years behind on features. If you only look at raster perf, like it's 2015, then AMD might work, but personally I demand good drivers, good features and decent RT perf as well. Right now I am playing Starfield with the DLSS mod, which easily beats TAA/FSR2 image quality.

I'm not sure I can ever go AMD again unless they improve FSR like crazy (and match Nvidia's other features). DLSS and DLAA are just highly superior, and DLDSR is amazing as well. AMD can't match these features at all, and they don't even have a counter to DLAA and DLDSR. Techspot compared FSR2 with DLSS2, and FSR did not win a single game across 26 titles, I believe.

So yeah, DLSS beats FSR. This is why people were mad that Starfield did not have DLSS; luckily we got a day-one mod that works flawlessly.
 
Impressive efficiency and technologies; awful naming, awful marketing, awful pricing; stagnation in every tier except AD102.
IMO it should be:
full 18432 Core AD102 = 4090 for 1299€
4090 -> 4080 Ti 999€
12288 core AD102 = 4080 for 849€
4080 -> 4070 699€
4070 -> 4060 479€
4060 Ti -> 4050 Ti 329€
4060 -> 4050 229€
 
My 12VHPWR connector
One case != whole picture.
You use 6700XT and call every RTX 4000 card bad?
Why does it matter? I am able to read and analyse articles. You don't need to have a gun to know it can shoot.
4070 absolutely stomps on 6700XT while using 25 watts less.
Hmmmm, a $480 MSRP ($320ish now) GPU loses to a newer $600 GPU... Why is that weird?
 
I like my 4070 Ti, it’s awesome. Best card I’ve owned. A bit pricey, but so is a bag of chips.
 
They're powerful, but Ngreedia just keeps increasing prices generation after generation. IMO Pascal (and especially the 1080 Ti) was their last good lineup when it comes to pricing.

The 4060/Ti especially is a total joke.
 
I like my 4070 Ti, it’s awesome. Best card I’ve owned. A bit pricey, but so is a bag of chips.
This isn't about a card.

It's about a series, or range, of cards, so a more general opinion is desired rather than a one-card view, please.
 
Technically, Ada is very solid: high performance, high performance per die area and per watt.
Marketing-wise, it's horrible; most SKUs are cut down too far for their tier and, on top of that, the pricing is awful.

However, I'm not sure the next generation will be much different; Nvidia is preparing the market for the nanometer wall.
 
This isn't about a card.

It's about a series, or range, of cards, so a more general opinion is desired rather than a one-card view, please.
The 4070 Ti is also a great example of their stupid naming this generation. The "4080 12GB", with significantly lower specs than the current 4080, was a total joke.

Also, like many have said already, the 4060 series should be the 4050 series. When the 4060 Ti loses to its older xx60 Ti sibling, something isn't right.
 
The 6700XT was 479 USD on release.

The 6700XT is 2½ years old by now, a last-gen card with little focus from AMD in terms of drivers and optimizations. EOL. No one cares about the price today. It barely does 1440p at 60+ fps in newer games on high settings. This is why the comparison is relevant.

A friend of mine bought a 6700XT with Starfield but returned it when he saw how terribly the game ran on it. The 6800XT was and is a far better card. Luckily he got a full refund while keeping the game, and bought a 4070 instead. He is much happier with that card.

The 6700XT with its 12GB of VRAM gets beaten by the 3070 with 8GB, even in 4K minimum fps, and the price difference at launch was 20 dollars.

So much for more VRAM :laugh:

Proof using newest drivers for all cards -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html

However, none of these cards are suited to 4K gaming. It just shows that VRAM and bandwidth do not matter when the performance actually exists.

If your 6700XT had a 512-bit bus and 64GB of VRAM it would still lose to the 4070. Why? Because the chip itself is weak. Actual performance is what sells.
 
but Ngreedia just keeps increasing the prices
It was literally hundreds less than the cheapest 7900XT.

Though those have come down now..
 
Why do you care about VRAM bandwidth when performance is present?
Higher-bandwidth RDNA3 GPUs handle 4K way better than these gimped nonsense "4070s": a stutterfest versus an unimpressive but much flatter frametime curve. The 4070 competes with the 7800 XT at 1080p, feels good at 1440p, and falls considerably behind at 4K. This means the 7800 XT will still be a decent 1440p gamer when new generations arrive, while the 4070 will ultimately be demoted to the 1080p tier.
Not sure what you are talking about with power.
Ada at 2.5 GHz consumes about 20 to 30% less power than at its actual 2.8 GHz, even without undervolting. That is 10% slower, yet all GPUs of this line-up are already bottlenecked by their almost non-existent VRAM bandwidth, so you'd only notice half of that loss. A ~300 W 4080 would have been just above 200 W if it had launched at its sweet-spot core clock. That would have completely destroyed AMD.
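The downclocking argument above can be sanity-checked with quick arithmetic. A minimal sketch where every number is an illustrative assumption taken from the paragraph (clocks, wattages, and the "only half of the 10% loss is visible" claim), not from measurements:

```python
def perf_per_watt(rel_perf: float, watts: float) -> float:
    """Relative performance divided by board power."""
    return rel_perf / watts

# Hypothetical 4080-class card: stock ~2.8 GHz at ~300 W versus a
# sweet-spot ~2.5 GHz at ~30% less power. The 10% clock deficit only
# shows up as ~5% real loss if the card is bandwidth-bound.
stock = perf_per_watt(1.00, 300)
sweet = perf_per_watt(0.95, 210)

print(f"perf/W gain from downclocking: {sweet / stock - 1:.0%}")  # ~36%
```

Even under these rough assumptions the perf-per-watt gain is large, which is the point being made about sweet-spot clocks.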
Ada is way more power efficient than Ampere
This is not an achievement, and not news either. Ampere is atrocious in terms of perf/watt, even worse than RDNA2 in that regard.
AMD had huge power spikes on 6800 and 6900 cards. They fixed this with 7000 series as well.
Not news either.
Do you think Nvidia has much focus on gaming GPUs when they beat AMD in this segment years ago? AMD is years behind on features.
Of course not. You're just repeating what I said in different wording.
personally I demand good drivers, good features and decent RT perf as well.
I also demand good drivers, good features and decent RT. AMD's drivers are good. The features are... not that good, especially in terms of launch dates. FSR3 should have shipped a YEAR ago, but it's still a promise rather than a released product. RT performance is almost on par with similarly priced Ada, so I don't know what you're talking about. EVERY card is atrocious at RT, with AMD being a bit worse in this department.
DLSS beats FSR.
Yet another piece of non-news.
Last gen card with little focus from AMD in terms of drivers.
And it's still superior to RTX 4060 series. xD
No one cares about the price today.
You != everyone.
 
You seem to be an AMD fanboy rather than having an opinion about Ada vs Ampere :laugh: Nothing about Ampere was better, except maybe launch price; however, Ampere was released during the mining boom, and prices were insane for 90% of the generation. It is pointless to talk about MSRP here when many people paid $1000+ for a 3080.

RDNA2 had good efficiency? Check out the power spikes above. Also multi monitor power draw is insane.

The 4060 series is crap, and your 6700XT slots right in between the 4060 and the 4060 Ti. Not really an achievement if you ask me...

The 4070 series and up are way better than the 4060 series. Like I said, Nvidia had little to no focus on lower-end GPUs this generation because of the GPU glut in this segment. Logic 101. The 3060 Ti sells for the same price as the 6700XT where I live, with similar performance, plus DLSS/DLAA/Reflex/ShadowPlay, better RT perf, and more and better features in general.

The 4070, 4070 Ti, 4080 and 4090 are all great cards. Personally, I only buy upper-mid-range or high-end, meaning 4070 Ti and up. The 4070 Ti is 20% faster than the 7800XT at 1440p, meaning roughly 25% faster than the 6800XT/3080, while using 250 watts on average. The 4080 was and is the most overpriced one; however, it has some of the best performance per watt, beating every single card from AMD.
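As a side note, relative-performance figures like "20% faster than X, meaning 25% faster than Y" chain multiplicatively rather than additively. A small illustration with assumed ratios (not taken from any review):

```python
def chain(*ratios: float) -> float:
    """Combine relative-performance ratios, e.g. A/B and B/C into A/C."""
    result = 1.0
    for r in ratios:
        result *= r
    return result

# If card A is ~1.20x card B, and card B is ~1.04x card C,
# then card A is about 1.25x card C (1.20 * 1.04 = 1.248):
print(f"{chain(1.20, 1.04):.2f}x")  # 1.25x
```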

AMD waited far too long to release the 7700XT/7800XT because they wanted to sell out their huge 6000-series inventory, which was collecting dust after the mining crash.
 
That's crazy talk. I play at 3840x2160 just fine.

True. A friend of mine uses a 4070 Ti for 4K/UHD just fine.

The 4070 Ti performs better than the RTX 3090/24GB in 4K -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html
It also beats the 6900XT/16GB by 15% in 4K...

Yet it is supposedly unplayable nonsense? :roll:

If you lack performance, simply enable DLSS2 and gain 50-75% performance, or sometimes more than 100% with DLSS3... DLSS also serves as a good anti-aliasing solution, best in class (DLAA), unlike FSR (read TPU's Starfield DLSS review). I love DLSS/DLAA/DLDSR for single-player games. I tend to skip it for fast-paced multiplayer games; they are rarely demanding, though.

The 4070 Ti is only 5 fps behind the 7900XT in 4K gaming in terms of minimum fps -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
 
You seem to be an AMD fanboy
And you seem to deliberately refuse to read what I wrote, and instead argue with things I never said. I am by no means an AMD fan, and I'd prefer nVidia any time of day, but it's debilitating how much more expensive nVidia GPUs are where I live. I was left with no choice but to get said 6700 XT for what I'd have paid for a 3050. I'd run an RTX 4090 if I had a proper job, no doubt about it.

The only thing where Ampere is physically better is raw VRAM bandwidth. But yeah, with much slower cores, it doesn't really matter.
And Ampere gave us very reasonable tier/performance scaling; it's only Ada that has this issue. AMD's fault.
That's crazy talk. I play at 3840x2160 just fine.
I know it's fine. But similarly priced AMD GPUs do the job better, unless RT is enabled.
RDNA2 had good efficiency?
Only compared to Ampere and older tech. Compared to Ada, it sucks. Power spikes are not the whole picture, and they are stupidly easy to avoid.
Also multi monitor power draw is insane.
Now it's merely bad. A year ago, it was worse than insane. I still hope for more improvements, despite never needing more than one display at a time.
6700XT with its 12GB VRAM gets beaten by 3070 8GB
Yes, I know the 6700 XT was a pretty terrible product for what AMD asked for it. And the 4060 cards are even worse. I only bought the 6700 XT because, in my particular case, all those 3060/3070s were about 50% more expensive. I don't regret going AMD. I regret not going for the RX 6800.
 
No, I am just tired of reading that the RTX 4000 series is bad when it is not true at all, mostly from people with RTX 3000 or Radeon 6000 series cards. Even the 4070 Ti beats every single card from last generation when you look at overall performance across many games, like what TPU is doing, especially in terms of performance per watt. The 4070 Ti beats the 3090 and the 6900XT/6950XT and pretty much performs like a 3090 Ti, at 250 watts on average in gaming, without even undervolting, and some people paid 1999 dollars for a 3090 Ti about 9 months before the 4070 Ti came out.

Also tired of people talking about 12VHPWR issues when the issue affected a few people globally who did not insert the connector correctly, or who used faulty adapters. I had zero issues with it, using a quality 12VHPWR cable natively from the PSU. I actually love the new connector. No way I am going back to 2, 3 or 4 times 8-pin instead of a single connector.

Nvidia and AMD mostly focused on cards that beat last-generation cards, which means the 4070 Ti and 7900XT and up. They could have left out the 4060 series and the low-tier Radeon 7000 cards, but those will take over when last-gen cards are sold out. Remember, those are EOL and gone when stock runs out. This is exactly why AMD only released the 7700XT/7800XT now. Nvidia did not bother to wait because they sell way more cards in general.

The 6700XT can be good enough for some, I don't disagree, but a lot of the new cards are great. Only the 4060 series seems kinda bad within the 4000 lineup IMO; all the other cards are meaningful, and 99% of people won't need more than 12GB for years to come.

The 4070 performs like a 3080, with DLSS3, 2GB more VRAM and more cache, at much lower wattage. This kind of performance is plenty for many people, easily maxing out most games at 1440p. The price is maybe 100 dollars too high, but meh; not many care about 100 dollars in the end for a GPU they will use for years. Good efficiency will matter more: buying a cheaper 3080 will cost more in electricity over the years and eat up the savings anyway.
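The "efficiency eats up the savings" claim is easy to put numbers on. A rough sketch where every input (wattages, hours per day, electricity price) is an assumption for illustration only:

```python
def yearly_cost(avg_watts: float, hours_per_day: float,
                price_per_kwh: float) -> float:
    """Electricity cost per year for a card drawing avg_watts while gaming."""
    return avg_watts / 1000 * hours_per_day * 365 * price_per_kwh

# Hypothetical: a used 3080 at ~320 W versus a 4070 at ~200 W,
# 3 hours of gaming per day, 0.35 per kWh.
diff = yearly_cost(320, 3, 0.35) - yearly_cost(200, 3, 0.35)
print(f"yearly difference: {diff:.0f}")  # ~46 per year
```

Whether 100 dollars of upfront savings survives a few years of use then depends mostly on hours played and local electricity prices.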
 