
So what does the Ada release by Nvidia mean to you

Has the Ada release by Nvidia been good in your opinion


I very much doubt it's that easy, or even close. Things are rarely, if ever, that clear cut in the real world.

It is a bad thing because it lowers your image quality and promotes developer laziness.

Just like fevgatos above, I very much beg to differ. You need look no further than Starfield for an example.
 
Yeah, them too, no doubt barring a few fanbois.
 
Do people not understand there is a factual way to compare image quality via overlaid pictures?

Here is what Bing AI said on how to do so.
Yes, it is possible to overlay frames and show errors in picture differences between each. One way to do this is by using OpenCV and Python. You can use the Structural Similarity Index (SSIM) to compare two images and determine if they are identical or have differences due to slight image manipulations, compression artifacts, or purposeful tampering. To visualize the differences between images using OpenCV and Python, you can draw bounding boxes around regions in the two input images that differ. You can also use the cv2.subtract() function to compute the difference between two images and color the mask red.
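For anyone who wants to try it, here is a minimal sketch of that approach in Python, assuming two same-size screenshots saved as native.png and upscaled.png (hypothetical filenames) and that opencv-python plus scikit-image are installed:

```python
# Compare two same-size screenshots and visualise where they differ.
# "native.png" and "upscaled.png" are placeholder filenames for illustration.
import cv2
from skimage.metrics import structural_similarity

native = cv2.imread("native.png")
upscaled = cv2.imread("upscaled.png")

# SSIM gives a single 0..1 similarity score plus a per-pixel difference map.
score, _diff_map = structural_similarity(
    cv2.cvtColor(native, cv2.COLOR_BGR2GRAY),
    cv2.cvtColor(upscaled, cv2.COLOR_BGR2GRAY),
    data_range=255,
    full=True,
)
print(f"SSIM: {score:.4f}")  # 1.0 means the frames are identical

# Absolute per-channel difference, used here as a red "error" overlay.
diff = cv2.absdiff(native, upscaled)
mask = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY) > 10  # threshold chosen arbitrarily
overlay = native.copy()
overlay[mask] = (0, 0, 255)  # mark differing pixels in red (BGR order)
cv2.imwrite("diff_overlay.png", overlay)
```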

The difference between objective and subjective is that objective information is based on facts, while subjective information is based on feelings or opinions.

Currently it seems most reviewers have only put out subjective reviews of these fps-boosting technologies.
 
While it may not be scientific or fully objective, anyone can view any number of side-by-side comparisons on YouTube. And while fairly un-scientific due to YouTube compression et al, that goes equally for both sides. For my own part I only needed to play the same games again with my 4090 as I had played with my 6800 XT. That settled the matter rather nicely.
 
Just like fevgatos above, I very much beg to differ. You need look no further than Starfield for an example.
I play at 1080p. Any kind of upscaling looks like shit at 1080p.

While it may not be scientific or fully objective, anyone can view any number of side-by-side comparisons on YouTube. And while fairly un-scientific due to YouTube compression et al, that goes equally for both sides. For my own part I only needed to play the same games again with my 4090 as I had played with my 6800 XT. That settled the matter rather nicely.
I don't believe in youtube. I believe in playing games and seeing for myself.
 
And while fairly un-scientific due to YouTube compression et al, that goes equally for both sides.
Huh, what? You do know that regular 4K compressed on YouTube will look appreciably worse than 1440p or 1080p uploads, right? Also, not sure if there's an enhanced-bitrate option available for 4K, but even at 1080p the image quality isn't that much better.

Worse as in the quality drops a lot more, just in case it wasn't clear.
 
Do people not understand there is a factual way to compare image quality via overlaid pictures?
[...]
Currently it seems most reviewers have only put out subjective reviews of these fps-boosting technologies.
Is it subjective to claim that 4k is superior to 320p in image quality? Do I need to use a tool to objectively measure that?
 
Any kind of upscaling looks like shit at 1080p.
Turn VSR on, change your resolution to 4K, play with FSR: Quality. Try believing your eyes.

No need to thank me.
 
Turn VSR on, change your resolution to 4K, play with FSR: Quality. Try believing your eyes.

No need to thank me.
I can't do VSR because switching resolutions messes up the window positions on my secondary screen.
 
Peeps complain about Ada pricing, but things were so bad with Ampere and the cryptocurrency mining shortage that even the pricing for the 4090 looks cheap by comparison.
This + inflation. Even only going back 5 years it’s kind of insane

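To put a rough number on the inflation point, here is a back-of-the-envelope sketch; the 4% average annual rate is an assumed illustrative figure, not official CPI data:

```python
# Rough sketch: what a launch price looks like after five years of inflation.
# The 4% average annual rate is an assumption for illustration, not CPI data.
msrp_2018 = 699            # illustrative launch price in 2018 dollars
annual_inflation = 0.04
years = 5

adjusted = msrp_2018 * (1 + annual_inflation) ** years
print(f"${msrp_2018} in 2018 is roughly ${adjusted:.0f} in 2023 dollars")  # ~ $850
```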
 
I would love to build a desktop, but there's no way in hell I'm going to spend a week's paycheck on a 4090 to get good performance over last generation. That money goes into my retirement account so I can retire at 55. The 4080 is good but still expensive, and nothing under it is cheap either; the 4060 Ti isn't any better than a 3060 Ti.
Make that 2-3 months where I live. :laugh: (If we are talking about a brand-new card, the 4090 starts around $2,000 here.)
Though the entire product stack is overpriced, so there's that.
 
It really is for 4K. But the RTX 4070 Ti is the wrong card, with the wrong specifications and the wrong market position.
The reason for the good performance uplift is the change from Samsung's 8nm to TSMC's 4N process.

And it's already memory starved in certain titles:

[attached benchmark screenshots]

Memory starved LMAO. Educate yourself :laugh:
RE4 and TLOU1 had massive VRAM issues that have since been fixed in patches and drivers. Even 8GB cards run just fine now.

Cyberpunk 2077 can't even run in Overdrive mode on my 4090. Good example :laugh:

Btw, heard of VRAM allocation? Many game engines allocate all the VRAM available.
There's not a single game that has VRAM issues with 12GB at 4K.

One of the best looking games recently released, Atomic Heart, uses 7.5GB maxed out at 4K -> https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html

Let's have a look at the 4060 Ti 8GB vs 16GB in 4K gaming -> https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/31.html

Pretty much identical performance, even in terms of minimum fps. VRAM doesn't matter when the GPU is the weak link. You won't be maxing out games anyway when the GPU is weak, meaning VRAM won't matter because you won't be running high settings.

Logic 101.
 
The release got screwed over by them taking a cheaper route with the memory setup, and relying on DLSS to make it work.

GDDR6 came out with 2GB per chip (16Gb) densities, and at higher speeds than the older, smaller modules.
That means you can now get the same VRAM capacity from half as many chips... but since each chip hangs off a 32-bit slice of the bus, half the chips also means half the bus width.

Energy efficiency: Sure, this came out great.

But did it come out great because it's good, or because the last gen was over-volted to the moon (yes), and because they cut back on the RAM bus size massively due to moving to higher-capacity VRAM modules? (Also yes)

The 3090Ti used a ton more power than the 3090 but was also more efficient
But it's more efficient? How, asks the voice in my head?


Why little voicey... because they used half as many memory modules, so more of that power went to the GPU and not the VRAM.
The difference on the Ti is that each memory chip has twice the capacity, so only half as many are required, which means no more memory chips on the back side of the card.
During the ETH mining boom, my 3090 would hit its 375W limit with the GPU clocked at 1.4GHz - super far down from the 2.1GHz it can boost to when the RAM is hardly in use.
Undervolting the GPU let me reach 1.6GHz at 370W, so it's easy to imagine the 3090 Ti's higher-capacity VRAM would have let it clock even higher with no other changes.
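To put rough numbers on that budget shift - the per-chip wattage below is an assumption for illustration, not a measured figure for GDDR6X modules:

```python
# Illustrative only: per-chip power draw is an assumed ballpark figure.
board_limit_w = 375        # the 3090 power limit mentioned above, held constant
watts_per_chip = 2.5       # assumed average draw per memory chip under load

chips_3090 = 24            # 24x 1GB modules, front and back of the PCB
chips_3090ti = 12          # 12x 2GB modules, front side only

vram_3090 = chips_3090 * watts_per_chip      # 60 W
vram_3090ti = chips_3090ti * watts_per_chip  # 30 W

print(f"GPU core budget, 3090:    {board_limit_w - vram_3090:.0f} W")    # 315 W
print(f"GPU core budget, 3090 Ti: {board_limit_w - vram_3090ti:.0f} W")  # 345 W
# Same board limit, ~30 W more left over for the GPU core and its clocks.
```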



These new GPUs have the same deal:
3060 Ti: 256-bit bus, 8x GDDR6 modules
4060 Ti: 128-bit bus, 4x GDDR6 modules

It's almost like halving the number of VRAM chips freed up more wattage, which massively boosts efficiency - but hurts performance in bandwidth-intensive situations.
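The bandwidth side of that trade is simple arithmetic; the data rates below are typical GDDR6 speeds for these cards, used here for illustration:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"256-bit @ 14 Gbps (3060 Ti-style): {bandwidth_gbs(256, 14.0):.0f} GB/s")  # 448
print(f"128-bit @ 18 Gbps (4060 Ti-style): {bandwidth_gbs(128, 18.0):.0f} GB/s")  # 288
# Faster chips don't make up for the halved bus; Ada leans on a much larger
# L2 cache to reduce how often it has to go out to VRAM at all.
```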



At lower resolutions these changes don't impact them, and they're pretty good.


But at 4K, it's worse than what it replaced. All VRAM-heavy titles will behave this way, and the gap will be bigger if you aren't running a top-of-the-line system like TPU's review rig, with its stupid-fast DDR5-6000 CL36 and PCI-E 5.0 - you need every nanosecond of lower latency to feed it new data, since it doesn't have the raw bandwidth to get the data across faster.

But who cares, right? DLSS saves the day by rendering at a lower resolution.
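For context on what "rendering at a lower resolution" means in pixel terms, these are the commonly cited per-axis scale factors for the DLSS presets at a 4K output:

```python
# Approximate internal render resolution for a 3840x2160 output.
# Scale factors are the commonly cited per-axis ratios for DLSS 2.x presets.
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 3840, 2160

for name, scale in presets.items():
    print(f"{name:<11}: {round(out_w * scale)}x{round(out_h * scale)}")
# Quality    : 2560x1440
# Balanced   : 2227x1253 (approx.)
# Performance: 1920x1080
```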



Edited in - didn't even notice an 8GB variant existed. I cared so little about these cards.
The 8GB and 16GB versions sit on the same 128-bit bus with 2GB GDDR6 modules - four of them on the 8GB card, eight in clamshell mode on the 16GB.
The power consumption barely changed between them.

The 3060Ti needs a hecking lot more power - but ~15W from that ram going to the GPU would make a world of difference


Throw in the smaller gains from it being a newer GPU design (even if it was just a refresh), a less complex PCB due to less wiring to the VRAM, cheaper VRMs - all those things SHOULD have added up to a much cheaper card that was good value for money... and it wasn't. If you gamed at higher resolutions, you were paying more for less.
 
Not worth the jump from the previous generation; not that huge a leap. The 4060 loses to the 3060 Ti in raw power, the 3070 Ti beats the 4070... the 4000 series is only good with DLSS, except for the two most powerful cards. I have used it a few times and I saw a difference (lower quality) compared to all the advertising etc. With 8GB of mega-duper Nvidia high VRAM I have already run into a wall in a few games (Far Cry 6, Harry Potter, Resident Sleeper) where fps goes from above 60 to a slideshow :love:. It ain't the VRAM, it ain't the VRAM, goddammit.

The main point of disappointment is that Nvidia pushes useless DLSS in games, markets DLSS as a groundbreaking innovation, and tries to cash in on people who believe everything they have read, seen and been told.

Sad that I got a 3060 Ti in the mining boom; the 6700 XT at that moment cost almost 2x the 3060 Ti, which was already overpriced cr@p.



Short answer - Meh. Next GPU will be from RED.
The 3070 Ti doesn't beat the 4070, haha. The 4070 performs like a 3080 at ~200 watts on average, which is 125 watts less than a 3080, and it has more VRAM and way more cache. The whole 4000 series has a lot more cache than Ampere.

The 3070 Ti even uses around 300 watts because of GDDR6X and delivers pretty much nothing over the regular 3070 with GDDR6, which consumes around 75 watts less, because bandwidth was never a problem for the 1440p gaming these cards are meant for. The only reason we got the 3070 Ti was that GDDR6 supply was running low while GDDR6X was in abundance.

Useless DLSS? Hahah... I bet you are using AMD right now, because DLSS/DLAA/DLDSR are superior to what AMD has, and DLSS/DLAA offers insanely good anti-aliasing that easily beats TAA and native AA methods in pretty much all games. Read any TechPowerUp comparison of FSR vs DLSS, or look at TechSpot's comparison where DLSS easily won across 26 different games.

I am using DLAA whenever I can. Best AA, no doubt. Zero shimmering or jaggies.

AMD doesn't even have an answer to DLAA, DLDSR or DLSS 3/3.5.

Also: ShadowPlay, RTX Video Super Resolution, CUDA, Reflex, way better RT performance, and I could probably go on. AMD drivers are also wonky once you leave the most popular games and benchmarks. AMD tends to have garbage performance in early access games and less popular titles. 9 out of 10 early access games are optimized for Nvidia GPUs.

All these features are the reason I am not even considering an AMD GPU these days. They are years behind on features and seem to focus only on raster performance. Hence the price.
 
Generally good cards, not-so-great pricing. 4060 Ti is the underwhelming card of the stack in both price/performance and gen-on-gen improvement (or lack thereof).
 
The release got screwed over by them taking a cheaper route with the memory setup, and relying on DLSS to make it work.
[...]
But at 4K, it's worse than what it replaced. If you gamed at higher resolutions, you were paying more for less.
Are you going to play at 4K with a 4060 or a 4060 Ti? Even if they had a 512-bit bus and 48 GB of VRAM, they would still be kinda terrible for that resolution. I mean, my 4090 struggles in the newest games at 4K, so...

Generally good cards, not-so-great pricing. 4060 Ti is the underwhelming card of the stack in both price/performance and gen-on-gen improvement (or lack thereof).
That's pretty insane considering the 4060ti is the 2nd best card in terms of price to performance, only losing to the just released 7800xt, and by less than 9% mind you.
 
Imagine how disruptive the 4080 would have been this generation had they priced it like the 3080, or even like the 4070 Ti really... It would have been interesting. I'm sure when AMD saw the actual price they were like, thank god a $900 7900 XT isn't so bad, lol.


I think the bottom line is that as consumers we really got screwed this generation; there are a bunch of cards that perform similarly with different strengths and weaknesses. Anyone who thinks either lineup is all that consumer friendly drinks too much Red/Green Kool-Aid, but that's just me. It just doesn't seem as bad because the last 3+ years have been crap, just at different levels depending on when.

Inflation is a big reason the 4000 series was priced higher. AMD's 7900 series was priced high as well; stop acting like it is only Nvidia. Last-gen cards also still had to be sold. Both Nvidia and AMD had huge stock because of mining demand and then the mining crash. AMD especially had huge stock, which is why they waited a loooong time with the 7700 and 7800 series, still selling the 6000 series deep into 2023.

People are screaming about prices because they feel the inflation as well, on all fronts. People can't afford new hardware and are holding on to their dated hardware while talking crap about new stuff, because they can't afford to upgrade and are trying to justify waiting. Human nature, nothing new.


The only meh cards this generation are the 4060 series and the 7600 series. These were almost pointless, at least until last-gen cards sell out.
The 4060 series will probably beat the entire Radeon 7000 series on the Steam HW Survey anyway in terms of marketshare. Why? Because x60 series cards always sell like hotcakes, and because 9 out of 10 people who are not into hardware won't even consider AMD due to reputation or a bad experience earlier.
 
What and where do you imagine it would lead to? It's already very much a niche market in terms of Nvidia's revenue stream. It's a bit sad, I guess, but the simple fact is that gamers don't really mean a thing to their bottom line.

Only if you think Nvidia-centric. EVGA left the sinking ship, XFX left the sinking ship... who's next? :D
Actually, I will happily applaud when Nvidia leaves this market, because I don't see its products as competitive - let them give that occupied space to someone who makes graphics cards better.

Do you even remember all the shenanigans by the greens?

The 3.5GB GTX 970: https://www.extremetech.com/gaming/...-lawsuit-over-the-gtx-970s-3-5gb-memory-issue
Then the RTX 4080 12GB rebranded to RTX 4070 Ti, when this card should be no more than an RTX 4060 Ti: https://www.gsmarena.com/nvidia_rtx...ti_available_january_5_for_799-news-57062.php
Then the infamous GeForce Partner Program, which was heavily criticised for its monopolistic nature and was eventually cancelled in 2018.

The Nvidia GeForce Partner Program was a marketing program designed to provide partnering companies with benefits such as public relations support, video game bundling, and marketing development funds.[1] The program proved to be controversial, with complaints about it possibly being an anti-competitive practice.[2] Nvidia canceled the program in May 2018.[3][4]
 
Only if you think Nvidia-centric. EVGA left the sinking ship, XFX left the sinking ship... who's next? :D
[...]
The 3.5GB GTX 970.
Then the RTX 4080 12GB rebranded to RTX 4070 Ti, when this card should be no more than an RTX 4060 Ti.
Then the infamous GeForce Partner Program, which was heavily criticised for its monopolistic nature and was eventually cancelled in 2018.
Think about it clearly. If the 4070 Ti should have been named a 4060 Ti, can you explain to me why it's beating the snot out of the 7800 XT, which is an xx80 competitor?

Nothing wrong with the 3.5GB 970 or the NGPP. Companies have been doing this on their own on both AMD and Intel platforms. You know, for example, Hero and Apex mobos are Intel-only, Crosshair mobos are AMD-only, etc. It has been happening for at least 15 years now. How it is anti-competitive, I've no idea.

Think about it: everything described in your link, AMD is already doing.

The Nvidia GeForce Partner Program was a marketing program designed to provide partnering companies with benefits such as public relations support, video game bundling, and marketing development funds.
 
Only if you think Nvidia-centric. EVGA left the sinking ship, XFX left the sinking ship... who's next? :D
Actually, I will happily applaud when Nvidia leaves this market, because I don't see its products as competitive - let them give that occupied space to someone who makes graphics cards better.
[...]
Sinking ship :laugh: :laugh: Meanwhile Nvidia sits at 80-85% gaming marketshare according to Steam, pretty much owns the enterprise market, and completely owns the high-end AI GPU market, which is worth billions upon billions. Nvidia stock is higher than ever for a reason. Stop trolling. Nvidia is doing superb right now. They will make hundreds of billions over the next 5+ years. Nvidia is projected to generate $300 billion in AI revenue by 2027; AMD can only dream about numbers like this, but they are years behind on AI.

Even with a huge AI focus, Nvidia can beat AMD on both enterprise and gaming on the side with ease.

AMD's big business is CPUs and APUs, not GPUs. They have been losing GPU marketshare for years for a reason. They earn more per wafer by making CPUs/APUs.

If AMD actually had the best GPUs I would be using an AMD GPU, but they don't, and they lack features like crazy + wonky drivers once you step outside the popular games (the ones that get benchmarked often - that's what AMD prioritizes). Most early access games run like crap on AMD GPUs in comparison to Nvidia. Why? Developers optimize for 80-85% of users instead of 10-15%. Most developers are using Nvidia as well.
 
Are you going to play at 4K with a 4060 or a 4060 Ti? Even if they had a 512-bit bus and 48 GB of VRAM, they would still be kinda terrible for that resolution. I mean, my 4090 struggles in the newest games at 4K, so...


That's pretty insane considering the 4060ti is the 2nd best card in terms of price to performance, only losing to the just released 7800xt, and by less than 9% mind you.

IMO a problem for the 4060 Ti is that the still-extant 3060 Ti is much better value at this time compared to the 4060 Ti, whereas the other cards in the stack don't have the same kind of self-competition to deal with. It has newer features than the 3060 Ti and lower power consumption, but is that worth $130 more?

($130, based on this: https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/33.html)
 
IMO a problem for the 4060 Ti is that the still-extant 3060 Ti is much better value at this time compared to the 4060 Ti, whereas the other cards in the stack don't have the same kind of self-competition to deal with. It has newer features than the 3060 Ti and lower power consumption, but is that worth $130 more?

($130, based on this: https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/33.html)
Maybe those prices apply in the US; in the EU that is not the case. The cheapest 3060 Ti I can find in Germany is 370€, the cheapest 4060 Ti is 409€. The cheapest 7800 XT is 549€, the cheapest 4070 is 599€.

Are FG and the lower power draw worth the 40€ between the two 60 Tis? Honestly, I don't know. Maybe. But the 4070 is definitely worth 50€ over the 7800 XT.

If AMD actually had the best GPUs I would be using an AMD GPU, but they don't, and they lack features like crazy + wonky drivers once you step outside the popular games (the ones that get benchmarked often - that's what AMD prioritizes). Most early access games run like crap on AMD GPUs in comparison to Nvidia. Why? Developers optimize for 80-85% of users instead of 10-15%. Most developers are using Nvidia as well.
They have wonky drivers even in popular games. Take Starfield, for example: AMD cards don't render... the sun.

Starfield on AMD
[screenshot]

Starfield on Nvidia
[screenshot]
 
It gives you moar frames when or if you need them.


That's merely fanboi-talk and exactly nothing whatsoever to do with the technology. I didn't ask what you think about that horrid Green Monster's money-grabbing, I asked why DLSS is useless. Not the same thing.

Indeed it is, just as long as that particular aspect is all you care about.

Again, that has nothing whatsoever to do with the technology in itself. All extraneous factors excepted, is DLSS a good thing or a bad thing?


Well, if you are after more numbers in the corner of your screen rather than image quality, I guess it works out fine. If you are just after a higher number on your screen, use FSR, which is free and runs on all three brands, whoopsie.

Money grabbing? Read again; even better, do it twice.

"Additional" - and as a customer you just need to purchase a new GPU from the next gen to get next-gen DLSS, while it could work on the 3000 series. I'm sorry, but the fanboyism is coming from you, since you clearly don't see the limitations.

You are defending limitations and are happy that the company makes you buy stuff just to get the newest DLSS version, meanwhile the 3060 Ti eats the 4060 with pork in games unless you use DLSS 3, which, by the way, is locked to the 4000 series specifically to make it more appealing in games where DLSS 3 is implemented. How many games are there with DLSS 3 support?

DLSS gives you worse quality than native res. Versions are deliberately locked behind GPU generations. I'll say it again - Nvidia's logic: give the customer a less powerful GPU and let DLSS cover for it. Because the number (fps) on people's screens is higher than without it. Who cares about quality.

I like these benching pipi talks. Fun to read when fanboyism is taking over.

Yep, I'm an AMD fanboy who uses an RTX GPU :D.


Is DLSS good or bad? Go read my previous post again and you'll see the answer.
 
Well, if you are after more numbers in the corner of your screen rather than image quality, I guess it works out fine.
[...]
DLSS gives you worse quality than native res.
[...]
Is DLSS good or bad? Go read my previous post again and you'll see the answer.

Yeah DLSS is so bad -> https://www.techpowerup.com/review/starfield-dlss-community-patch/
:roll:

"The default anti-aliasing method in Starfield is TAA (Temporal Anti-Aliasing), which results in a very blurry image at all resolutions, including 4K. It also fails to render small object details like thin steel structures, power lines, transparent materials, tree leaves, and vegetation well. Additionally, there are noticeable shimmering issues across the entire image, even when you're not moving, especially at lower resolutions like 1080p."

"The official implementation of FidelityFX Super Resolution (FSR) in Starfield addresses most of these problems but still exhibits excessive shimmering on thin steel objects, transparent materials, tree leaves, and vegetation. Enabling DLSS resolves these shimmering problems and ensures a stable image quality, even at lower resolutions like 1080p."

"The community patch also supports DLAA (Deep Learning Anti-Aliasing), which takes image quality a step further, surpassing the quality of native in-game TAA, FSR, or DLSS. Enabling DLSS/DLAA also enhances the quality of in-game particle effects, providing a more comprehensive and stable appearance, particularly during motion, compared to FSR."

Also, AMD GPUs don't even render the sun, like fevgatos stated. Even though the game is AMD-sponsored and optimized for AMD CPUs and GPUs (which is why DLSS support is not present - AMD hates it when FSR is compared with DLSS, because FSR always loses).

The game simply looks best on Nvidia, especially with the DLSS/DLAA mod, easily beating native-resolution visuals and FSR. TAA looks horrible.

I bet you don't even use an Nvidia GPU :laugh:

TAA is garbage in most games. Most devs prioritize DLSS/DLAA/FSR now. DLSS 3 is in tons of games. You can easily mod most games by simply swapping the DLL; there are several tools for this, and it's very easy. Pretty much all new games have DLSS/DLAA (DLAA is a preset of DLSS now, meaning DLSS games have DLAA - making TAA useless).

DLAA is the best AA method, hands down. DLSS works wonders as AA while improving performance, and it beats FSR in 9 out of 10 games; the last one is a draw, meaning FSR wins in zero games.

Aaaaaand this is why AMD GPUs are generally cheaper: fewer and worse features, wonky drivers and lower resale value.

AMD doesn't even have an answer to DLAA or DLDSR, which are both awesome features. 4K DLDSR on a 1080p or 1440p display looks insanely good - close to 4K visuals without the performance hit of 4K, and you can even use DLSS Quality on top to bump performance even higher while retaining most of the IQ.
 