
Nvidia's GPU market share hits 90% in Q4 2024 (gets closer to full monopoly)

5070 Ti is definitely the better value at $150 more than the 9070 XT. Imagine being on RDNA 3 and learning you can't run FSR 4 while every card since Turing can do DLSS 4.
MSRP vs MSRP, I think the 5070 Ti still has appeal for the feature set, but it's very close. Will that be worth it to literally everyone? Of course not. But I guarantee it will be for many.

Current retail pricing in many regions suggests the 9070 XT is even better value than an MSRP comparison implies, though.
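To put rough numbers on that value comparison, here's a minimal sketch. The MSRPs are the launch figures ($749 vs $599, matching the $150 gap mentioned above); the relative-performance number and the street price are placeholder assumptions, not measured data.

```python
# Minimal perf-per-dollar sketch for the MSRP-vs-street-price point above.
# MSRPs are the launch figures; the relative-performance value and the
# street price are placeholder assumptions, not measured data.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance (9070 XT = 100) divided by price in USD."""
    return relative_perf / price_usd

cards = {
    "RTX 5070 Ti @ MSRP":  (110, 749),  # assumed ~10% faster, illustrative
    "RX 9070 XT @ MSRP":   (100, 599),
    "RX 9070 XT @ street": (100, 650),  # hypothetical regional street price
}

for name, (perf, price) in cards.items():
    print(f"{name:22s} {perf_per_dollar(perf, price):.3f} perf/$")
```

Under these assumed numbers the 9070 XT keeps the perf-per-dollar edge even at an inflated street price; swap in real benchmark figures and regional prices to draw your own conclusion.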

It's also amusing (and the continuation of a repeatable pattern) that there was so much talk about 10 series owners being 'left out in the cold' when DLSS launched, or 30 series owners voicing the same sentiment when FG launched... I'm not hearing the same about ≤RDNA3 and FSR4, but hey, maybe that's just what I see.
Maybe it won't finewine like AMD does
From where I'm sitting, it's Nvidia that's finewined harder than AMD in the last few generations. My GTX 1080 gained support it never had (FreeSync) and that I never counted on getting; the 3080 benefited from all the DLSS SR development over the years, including being able to run the latest Transformer model. Now we have a 4080 equalling a 7900 XTX in the 2025 test suite at 4K, where it trailed by a few % consistently at launch (similar story with the 3080 and 6900 XT). Just a few examples that spring to mind that suggest the trend is more true of the opposite company.
 
MSRP vs MSRP, I think the 5070 Ti still has appeal for the feature set, but it's very close. Will that be worth it to literally everyone? Of course not. But I guarantee it will be for many.
For people with very old cards, and plenty of others, it's appealing. But will it flip NVIDIA users? Maybe a lot of the 30 series and prior card users, but otherwise, I don't think so. Compared to a 5070 Ti it's a good bargain, but compared to RDNA 3 or RDNA 2? Not enough of a value or performance uplift to warrant a new purchase.
It's also amusing (and the continuation of a repeatable pattern) that there was so much talk about 10 series owners being 'left out in the cold' when DLSS launched, or 30 series owners voicing the same sentiment when FG launched... I'm not hearing the same about ≤RDNA3 and FSR4, but hey, maybe that's just what I see.
With the 10 series cards it's because they don't have the hardware, and the same's true with the old FG model. NVIDIA should definitely make FG happen on 30 series cards, though they might not.
From where I'm sitting, it's Nvidia that's finewined harder than AMD in the last few generations. My GTX 1080 gained support it never had (FreeSync) and that I never counted on getting; the 3080 benefited from all the DLSS SR development over the years, including being able to run the latest Transformer model. Now we have a 4080 equalling a 7900 XTX in the 2025 test suite at 4K, where it trailed by a few % consistently at launch (similar story with the 3080 and 6900 XT). Just a few examples that spring to mind that suggest the trend is more true of the opposite company.
How the turn tables.
I have to agree with this on drivers at least. I can get drivers that have just been released for Windows 10/11 for my GTX 980 (Driver Results | GeForce GTX 980 | Windows 11 | NVIDIA), a card released in 9/2014, but I cannot get a recent driver for a Vega 56 released in 8/2017 (Radeon™ RX Vega 56 Drivers).
You are now aware the 980 has been supported since Windows XP.
 
It's also amusing (and the continuation of a repeatable pattern) that there was so much talk about 10 series owners being 'left out in the cold' when DLSS launched, or 30 series owners voicing the same sentiment when FG launched... I'm not hearing the same about ≤RDNA3 and FSR4, but hey, maybe that's just what I see.
Probably because not many people care about FSR. It's not hyped nearly as much as DLSS is, and its users acknowledge the fact that upscaling is a crutch for weaker hardware, not a must-have thing to rely on all the time.

From where I'm sitting, it's Nvidia that's finewined harder than AMD in the last few generations. My GTX 1080 gained support it never had (FreeSync)
FreeSync is a project of AMD and VESA, so don't thank Nvidia for giving it to you. They only did so because G-Sync failed, being too expensive and locked down.

and that I never counted on getting; the 3080 benefited from all the DLSS SR development over the years, including being able to run the latest Transformer model. Now we have a 4080 equalling a 7900 XTX in the 2025 test suite at 4K, where it trailed by a few % consistently at launch (similar story with the 3080 and 6900 XT). Just a few examples that spring to mind that suggest the trend is more true of the opposite company.
The 4080 and 7900 XTX have always been around the same level. A couple of % here and there doesn't matter.
 
But will it flip NVIDIA users? ... I don't think so.
That's kind of where I'm at too. AMD already have a loyal fanbase who decided they wanted this card anyway (just look over this forum or Reddit, for example), but their mountain to climb is conversions. I believe they have gotten some purely for having a card that launched in reasonable numbers and could be bought without paying WAY over MSRP while 50 series stock is limited. In a world where all cards were readily available at MSRP, I don't think they'd convert as many as they wanted, so well played to some degree here, taking advantage of the market situation.
With the 10 series cards it's because they don't have the hardware,
Oh, I know and understand that, and as a 1080 owner myself I was not salty one bit; oftentimes advancement (and actually having a performant and good-looking output) necessitates new hardware. Just look at FSR ≤3.1 vs what 4 is showing it can do.

I've also noticed that I don't see the "tensor cores are useless / a gimmick" argument anymore, now that AMD is including "AI accelerators".
Probably because not many people care about FSR. It's not hyped nearly as much as DLSS is, and its users acknowledge the fact that upscaling is a crutch for weaker hardware, not a must-have thing to rely on all the time.
Some users acknowledge that, you speak as if they all do. Some (many) users know that playing with DLSS/DLAA Tx model is the cleanest image quality you can get in games with forced TAA in the render path and thus is a highly desirable feature. FSR4 would appear to be showing similar, if slightly lesser, merits.
The 4080 and 7900 XTX have always been around the same level. A couple of % here and there doesn't matter.
It sure mattered to people here that the 7900XTX was consistently ahead at launch, so I think it's interesting that it has lost some of that ground relative to the 2025 test suite. If memory serves, it was asserted at the time that we were to expect the 7900XTX to increase that lead in raster, not decrease it.
 
Some users acknowledge that, you speak as if they all do. Some (many) users know that playing with DLSS/DLAA Tx model is the cleanest image quality you can get in games with forced TAA in the render path and thus is a highly desirable feature. FSR4 would appear to be showing similar, if slightly lesser, merits.
DLAA is not DLSS. The fact that you can use it in conjunction with DLSS doesn't make DLSS, or upscaling in general better.

It sure mattered to people here that the 7900XTX was consistently ahead at launch, so I think it's interesting that it has lost some of that ground relative to the 2025 test suite. If memory serves, it was asserted at the time that we were to expect the 7900XTX to increase that lead in raster, not decrease it.
I'm not sure about "people". To me personally, any difference below ~20% is too insignificant to talk about.
 
DLAA is not DLSS. The fact that you can use it in conjunction with DLSS doesn't make DLSS, or upscaling in general better.
Same AA algorithm though, the best temporal one even.

I know exactly where you personally stand on upscaling, but suffice to say it's an important feature to a hell of a lot of people, especially ones in the market for a new card today, and even relative to what they have access to right now. I can't and won't get on board with downplaying it and naysaying it based on what it's doing in principle; all I care about is reality and the results. The reality is it's here to stay, and the results are improving every day. FG I consider an entirely different can of worms.
I'm not sure about "people". To me personally, any difference below ~20% is too insignificant to talk about.
To each their own; to me, the devil is in the details, and there's a lot to discuss between 0 and 20%.
 
Same AA algorithm though, the best temporal one even.
Completely different feature, completely different goal and effect.

I know exactly where you personally stand on upscaling, but suffice to say it's an important feature to a hell of a lot of people, especially ones in the market for a new card today, and even relative to what they have access to right now. I can't and won't get on board with downplaying it and naysaying it based on what it's doing in principle; all I care about is reality and the results. The reality is it's here to stay, and the results are improving every day. FG I consider an entirely different can of worms.
Reality is subjective. Stating your reality is not "downplaying" or "naysaying". I don't know where so many people these days get the idea that you have to like everything.

I can equally say that upscaling is a crutch for weaker hardware / higher resolution (4K) gaming.

To each their own; to me, the devil is in the details, and there's a lot to discuss between 0 and 20%.
I respectfully disagree. We're talking about differences that you'd never notice without an FPS counter on screen.
 
Completely different feature, completely different goal and effect.
Best AA at native, best AA and output when upscaling from lower resolution. Semantics perhaps. At the end of the day I don't really care if people think of them differently, they're a bundled feature set that clearly is desirable to buyers.
I don't know where so many people these days get the idea that you have to like everything.
I never said you have to; that's just how I feel in particular about good super-resolution upscaling. Because I do like it, I will almost without fail provide balance to a conversation when it is downplayed or naysayed, showing the other side of the coin, especially when statements are made so broadly as to sound like universal truths, a bugbear of mine.
I respectfully disagree. We're talking about differences that you'd never notice without an FPS counter on screen.
I think that's a very broad-reaching statement that I can't agree with; there is so much nuance worth exploring within a 20% delta across many titles, IMO.
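For concreteness, here's the arithmetic behind that disagreement: a quick sketch converting an FPS gap into frame-time terms. The baseline FPS values are purely illustrative.

```python
# Worked arithmetic for the "<20% delta" debate: convert an FPS gap into
# frame-time savings. Baseline FPS values are illustrative, not measured.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for base_fps in (60.0, 144.0):
    faster = base_fps * 1.20  # a card that is 20% faster
    saved = frame_time_ms(base_fps) - frame_time_ms(faster)
    print(f"{base_fps:.0f} -> {faster:.0f} fps: "
          f"{frame_time_ms(base_fps):.2f} ms vs {frame_time_ms(faster):.2f} ms "
          f"({saved:.2f} ms saved per frame)")
```

At 60 fps a 20% gap is worth about 2.8 ms per frame; at 144 fps it shrinks to roughly 1.2 ms, which is part of why people weigh the same percentage delta so differently.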

Happy to agree to disagree, on all points really, lest we continue to go in circles :peace:
 
Best AA at native, best AA and output when upscaling from lower resolution. Semantics perhaps. At the end of the day I don't really care if people think of them differently, they're a bundled feature set that clearly is desirable to buyers.
I never said DLAA was bad. All I'm saying is that mentioning it in context of DLSS is a deflection of the topic. Sure, you can use them together. Does that have anything to do with DLSS alone? No.

I never said you have to; that's just how I feel in particular about good super-resolution upscaling. Because I do like it, I will almost without fail provide balance to a conversation when it is downplayed or naysayed, showing the other side of the coin, especially when statements are made so broadly as to sound like universal truths, a bugbear of mine.
You can feel how you want. That doesn't mean that "the other side of the coin" isn't a valid point of view. The world doesn't exist to validate your feelings. And that's not "naysaying", just fact.

I think that's a very broad-reaching statement that I can't agree with; there is so much nuance worth exploring within a 20% delta across many titles, IMO.

Happy to agree to disagree, on all points really, lest we continue to go in circles :peace:
Sure, let's agree to disagree on that.
 
I have not owned a red GPU since the days they were called ATI. The 9000 series swayed me. I guess we'll see if I end up regretting this.
I am watching you, and others like you :D

I am going to pass my 4070 Ti down to my son soon.
 
That doesn't mean that "the other side of the coin" isn't a valid point of view.
I don't remember saying your point of view was invalid; I just shared my opinion, which happens to be fairly opposite, and said that I will typically jump in and provide a counter opinion/argument when (good) upscaling is talked about negatively, or for reasons I don't agree with. Nor did I assert the world exists to validate my feelings. This is a very odd line of thinking, and perhaps you're not understanding me correctly or making assumptions.

It's super valid to not like upscaling whatsoever (I think doing so is completely daft, but it's absolutely anyone's prerogative to dislike anything they want, for any reason they want). I don't see why that would stop me engaging/countering that line of thinking in a public forum with my own opinions or arguments. In fact, I think the balance that showing both sides of a coin can provide is valuable. Surely you can see I have negative points of view that you and others feel compelled to engage with, especially on this forum if it's about AMD.
 
I don't remember saying your point of view was invalid; I just shared my opinion, which happens to be fairly opposite, and said that I will typically jump in and provide a counter opinion/argument when (good) upscaling is talked about negatively, or for reasons I don't agree with. Nor did I assert the world exists to validate my feelings. This is a very odd line of thinking, and perhaps you're not understanding me correctly or making assumptions.

It's super valid to not like upscaling whatsoever (I think doing so is completely daft, but it's absolutely anyone's prerogative to dislike anything they want, for any reason they want). I don't see why that would stop me engaging/countering that line of thinking in a public forum with my own opinions or arguments. In fact, I think the balance that showing both sides of a coin can provide is valuable. Surely you can see I have negative points of view that you and others feel compelled to engage with, especially on this forum if it's about AMD.
Don't get me wrong, I fully respect your opinion. If DLSS works great for you thanks to DLAA, that's awesome. But that's DLAA's merit, not that of DLSS.

Upscaling always gives you more performance in exchange for a lower quality image. That's what it does. The fact that you can improve on your image quality, even beyond native sometimes thanks to DLAA doesn't prove upscaling to be good. It only proves DLAA to be good. They're two entirely different topics.

So if you feel inclined to jump in to defend upscaling, then please allow me to jump in myself to set our facts straight. You can still like upscaling all you want, it doesn't bother me. :)
 
I use it on certain titles. Works great. My native resolution is 3840x2160.
 
The fact that you can improve on your image quality, even beyond native sometimes thanks to DLAA doesn't prove upscaling to be good. It only proves DLAA to be good. They're two entirely different topics.
I don't think they're different topics given that DLAA uses the exact same pipeline as DLSS, but working on a native resolution input instead of a lower res one.
 
Upscaling always gives you more performance in exchange for a lower quality image. That's what it does. The fact that you can improve on your image quality, even beyond native sometimes thanks to DLAA doesn't prove upscaling to be good. It only proves DLAA to be good. They're two entirely different topics.
The crux here is: what is the alternative? If the (perhaps only) alternative, rendered at native, has worse image quality than DLSS upscaling, even if we assume performance parity, then it does prove DLSS to be good. A whole new dynamic if we consider performance too. Different topics perhaps, but there are a lot of 'facts' to consider here to uhhhh... set straight. It's easy for us both to be selective about them.
 
I don't think they're different topics given that DLAA uses the exact same pipeline as DLSS, but working on a native resolution input instead of a lower res one.
What pipeline it uses doesn't matter. It's not the same thing.

Windows and Linux aren't the same thing just because they both run on an x86 CPU.

The crux here is: what is the alternative? If the (perhaps only) alternative, rendered at native, has worse image quality than DLSS upscaling, even if we assume performance parity, then it does prove DLSS to be good. A whole new dynamic if we consider performance too. Different topics perhaps, but there are a lot of 'facts' to consider here to uhhhh... set straight. It's easy for us both to be selective about them.
Native never has worse quality than DLSS. You're perhaps talking about TAA and DLAA, which are indeed entirely different topics. DLAA doesn't prove DLSS to be good, neither does TAA in X game being bad.
 
Native never has worse quality than DLSS.
Yes, native DLAA will always look better than DLSS using the same model. Agreed. I again don't remember saying the opposite. But the realities of playing a given game present many different combinations of settings and limits on what the game will allow you to do, and it is a fact that over many games DLSS Q has provided better image quality than native TAA where DLAA was not an option; ergo, in that situation, within those limitations, it proves upscaling to be good.

I also agree with @igormp, same pipeline, same tech, they're one and the same with different input resolutions, they're intrinsically linked.
 
What pipeline it uses doesn't matter. It's not the same thing.
IMO it still is.
Windows and Linux aren't the same thing just because they both run on an x86 CPU.
I don't think that's a fair equivalence whatsoever, but you do you.
Native never has worse quality than DLSS.
It is possible.
Also, that perception of "native being better because raster" is kinda silly in my opinion, quoting myself from another thread from some time ago:
Reminder that the only actual "native" thing for us would be you actually seeing something with your own eyes.
Renders are just some math that tries to approximate what we have in the real world. Our current approximations for games (in raw raster) are reasonable, but far from good given the real-time constraints. We can achieve better quality with longer renders that do more math, and more precise math (still far from real life, but improving), but that'd be useless for games given the speed.

Upscaling with machine learning is nothing more than another different way to try to do this approximation in a non-deterministic way.
 
Yes, native DLAA will always look better than DLSS using the same model. Agreed. I again don't remember saying the opposite. But the realities of playing a given game present many different combinations of settings and limits on what the game will allow you to do, and it is a fact that over many games DLSS Q has provided better image quality than native TAA where DLAA was not an option; ergo, in that situation, within those limitations, it proves upscaling to be good.

I also agree with @igormp, same pipeline, same tech, they're one and the same with different input resolutions, they're intrinsically linked.
DLAA isn't native. It's higher than native, downscaled. Of course it's better than native; that goes without question. Whether TAA is native or not is questionable. Some games force it by default, which is bad.

Ok, then an apple and a cucumber are the same thing IMO. They're both green things growing on plants, so they're the same.

DLSS = Lower resolution image upscaled to your monitor's native.
DLAA = Higher resolution image downscaled to your monitor.

How these can be called the same is beyond me. They're complete and total opposites that just happen to work well together.
 
DLAA isn't native. It's higher than native, downscaled. Of course it's better than native; that goes without question. Whether TAA is native or not is questionable. Some games force it by default, which is bad.


Ok, then an apple and a cucumber are the same thing IMO. They're both green things growing on plants, so they're the same.

DLSS = Lower resolution image upscaled to your monitor's native.
DLAA = Higher resolution image downscaled to your monitor.

How these can be called the same is beyond me. They're complete and total opposites that just happen to work well together.

You're thinking of DLDSR or DSR, which perform downsampling from a higher rendered resolution. DLAA is just DLSS where the input and output are the same (native) resolution, or in other words the render resolution is 100%.
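To make the distinction concrete, here's a small sketch using the commonly cited per-axis render-scale factors (DLSS Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5); DLAA is simply the same pipeline at a scale of 1.0. The function and table are illustrative, not any real API.

```python
# Illustrative sketch: DLAA is the DLSS pipeline run with a per-axis
# render scale of 1.0, i.e. input resolution == output resolution.
# Scale factors are the commonly cited per-axis values, not an API.

SCALE = {
    "DLAA":        1.0,    # native-resolution input, AA only
    "Quality":     2 / 3,  # ~0.667 per axis
    "Balanced":    0.58,
    "Performance": 0.5,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = render_resolution(3840, 2160, mode)
    print(f"{mode:11s} renders {w}x{h} -> outputs 3840x2160")
```

At 4K output, Quality mode renders internally at 2560x1440 while DLAA renders at the full 3840x2160, which is exactly why DLAA never upscales: there's nothing lower-resolution to upscale from.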
 
Upscaling with machine learning is nothing more than another different way to try to do this approximation in a non-deterministic way.
This harks back to a notion I've raised before: some people are against the tech from a purely academic "what it is doing" standpoint, rather than the actual results. The proof is in the pudding.
DLAA isn't native. It's higher than native, downscaled. Of course it's better than native; that goes without question. Whether TAA is native or not is questionable. Some games force it by default, which is bad.
No, DLAA is just the AA algo from DLSS used on a native-resolution render; you're perhaps thinking of DSR/DLDSR.
 
Take AMD off, put ATi back on, return to Canada. We will fix it.
Pssttt they are still known as ATi internally

I have not owned a red GPU since the days they were called ATI. The 9000 series swayed me. I guess we'll see if I end up regretting this.
You have been missing out...
 