
NVIDIA Grabs Market Share, AMD Loses Ground, and Intel Disappears in Latest dGPU Update

Low quality post by Dr. Dro




I made sure to point out that DRS is enabled and set to 144 fps, since this is the second thread that's been hijacked over this exact subject
 
Low quality post by JustBenching

I made sure to point out that DRS is enabled and set to 144 fps, since this is the second thread that's been hijacked over this exact subject
Even with that on, there's zero chance you get 160 fps with the game maxed out, unless AFMF is on in the driver
 
Lots of products released in 2023 and 2024. Easy example: the whole Barcelo-R lineup is made of Vega iGPUs. The 7330U, for example; they even name them as if they were brand new lol.

That right there is true. I had to run out and get a laptop for my stepson recently with just a day to research, and AMD's naming convention is hyper deceptive in the mobile space. There are a ton of 7520U and 7530U Zen 3 / Vega based brand new laptops on the market. These are rebranded N7-node Cezanne parts, many released in early 2021. AMD clearly used this deceptive naming convention as a means to inflate the perceived value of their older chips, probably to avoid a write-down on old inventory.
 
I've been reading that thread with genuine curiosity, wondering what sort of machines they are sticking these cards into. After my 6800 XT died, I got my 9070 XT Red Devil six weeks ago and it's been rock solid, so maybe I've been lucky, as it runs faster than most overclocked 5070 Ti scores I've seen with minimal tuning.

Your sarcasm aside, projected service life was my only disappointment with my 6800 XT; I fully expected it to last until next gen. The failure was completely unexpected, and the last time I had one like it was a GTX 480. I can't complain, but it made the card's value proposition worse through lost service time, and the lack of resale value will definitely affect my decision making next generation.

If you don't mind my asking, what was the AIB of your failed 6800XT? Also, just curious as to your comment of "decision making next generation", would that be AIB brand or Nvidia/AMD?
 
Low quality post by Dr. Dro
Even with that on, there's zero chance you get 160 fps with the game maxed out, unless AFMF is on in the driver

From what I can extrapolate from W1zz's latest review data, that seems to be the case. With DRS off, even the RTX 5090 needs frame generation to venture comfortably into the triple-digit realm.

[Chart: Spider-Man 2 ray tracing performance, 3840×2160, from the TPU review]
 
Low quality post by blinnbanir
This is actually sad. If I was using AMFM or whatever it would not show Global Experience in AMD software. It is always the same people that don't understand. You see I bought Spiderman 2 after it showed 27 FPS in the TPU review and it was on sale.

What none of you can seem to understand is having an all AMD build that is faster than the PS5 will mean that you will get great performance when they are ported over to PC. Are you enjoying Gamepass? You can thank AMD for that too as the same Games are on the Xbox. That fact alone is enough to allow for someone that owns AMD to spend more time enjoying their product than belly aching about Nvidia.

I do agree on the market share argument but it is not because AMD is bad at Gaming. The glaring issue though is that dGPU is no longer the most viable way to get into Gaming. I am willing to bet that there are more Steam Decks sold than any dGPU or laptop that is more expensive. That is why I love Gaming on my 800+ game library PC and the only Game that makes my PC not sing is Cities: Skylines 2, but you need a high population and a big city to see that. The best though is I gifted my Nephew my previous build and he loves it. That is all AMD too.

I can't speak to why the 7900XT shows 27 fps in that review but I know I get way more than that. If you choose not to believe that, what can I do? So just stop attacking me as if I just pulled the numbers out of a hat. This reminds me of the Starfield flame up or even Avatar. While people were complaining online about no DLSS AMD users were busy enjoying the Game and wondering what all the fuss was about.
 
Low quality post by Vayra86
I have seen plenty of comments on this very site claiming that the 4090 and 5090 are the only cards that can do 4K. I am using a 4K screen and do not have a 4090/5090, so obviously I do not believe it.
Sure, there are people on the internet saying odd shit to bait a response. What else is new? That doesn't mean it's real.

FWIW I don't get why tf we are talking about Spiderman now either...
 
Low quality post by JustBenching
Because misinformation isn't welcome in a tech forum, is it? Say someone actually believes him, buys a 7900 XT, and expects to get 160 fps in maxed-out Spidey 2 at 4K, only to find out he is actually getting 26.
Nah, it's off-topic
 
Because misinformation isn't welcome in a tech forum, is it? Say someone actually believes him, buys a 7900 XT, and expects to get 160 fps in maxed-out Spidey 2 at 4K, only to find out he is actually getting 26.

Indeed
 
This is actually sad.

I agree, but not for the reasons you posed.

If I was using AMFM or whatever it would not show Global Experience in AMD software

Not necessarily.

It is always the same people that don't understand.

I believe it's more accurate to say that it's always the same people willing to engage in discussion.

You see I bought Spiderman 2 after it showed 27 FPS in the TPU review and it was on sale.

A sale is always nice, I guess! Happy gaming.

What none of you can seem to understand is having an all AMD build that is faster than the PS5

While true, this is not because the system is all-AMD, but rather because the PS5 hardware is rather anemic and out of date. The same can be said for Intel+AMD, Intel+Nvidia and AMD+Nvidia builds. Perhaps even the odd Arc A770 and B580 builds, GPUs which are in far fewer machines than I'd hoped for.

will mean that you will get great performance when they are ported over to PC.

Not true. There is no evidence whatsoever to support this claim, and plenty of evidence to the contrary.

Are you enjoying Gamepass? You can thank AMD for that too as the same Games are on the Xbox.

Xbox Game Pass is simply a low-cost way to rent games. AMD has no involvement with it. I cancelled my subscription years ago, as I stopped using my Xbox One S and I can just buy the odd game I'm interested in through Steam.

That fact alone is enough to allow for someone that owns AMD to spend more time enjoying their product than belly aching about Nvidia.

Yet... here you are.

I do agree on the market share argument but it is not because AMD is bad at Gaming.

Loaded statement. They aren't bad per se, but they are in no position to claim a spot as a premium graphics vendor. The low market share is a reversible situation, as long as they stay the course and continue to improve their products. Even now with RDNA 4, they aren't exactly matched up with the competition. It's no longer the horrible mess it used to be, though, and over the generations they have made quite a bit of progress, which I expect to speed up now that the foundational work (drivers, hardware with the appropriate features such as tensor cores and a competent video coding engine) has been laid.

The glaring issue though is that dGPU is no longer the most viable way to get into Gaming.

It is the only viable way to have a solid gaming experience on a desktop PC, and the only way to have any form of high-end experiences on a laptop computer.

I am willing to bet that there are more Steam Decks sold than any dGPU or laptop that is more expensive.

Not the case, due to regional availability concerns. It is true that iGPUs are ubiquitous in this space. However, even with AMD's great advances in the mobile space and their excellent mobile SoCs, the lion's share of mobile processors (and consequently, iGPUs) still belongs to Intel.

That is why I love Gaming on my 800+ game library PC and the only Game that makes my PC not sing is Cities: Skylines 2, but you need a high population and a big city to see that.

It is the least to be expected of a 7900 XT. I've also amassed a great library of games over the 16 years I've been gaming on PC, and I believe there is not one of them that wouldn't run on your machine. However, it is equally true that there are games where faster hardware would be of benefit, if the eye candy and/or resolution are turned all the way up.

The best though is I gifted my Nephew my previous build and he loves it. That is all AMD too.

I regularly give away my old gear to my brother for him to come up with builds for himself. His main PC still runs the Ryzen 9 3900XT I owned back in 2020, before I upgraded to the 5950X. He was satisfied with it and did not want the 5950X once I retired it, so I sold it and bought the i9-13900KS I currently own, as the 7950X3D was delayed and I didn't want an 8-core processor. Turns out that was the right call to make at the time, since the Core Ultra is not a good processor and the 16-core X3Ds have that split topology issue, so the wait continues for a 12-core+ 3D CCD chip.

I can't speak to why the 7900XT shows 27 fps in that review but I know I get way more than that. If you choose not to believe that, what can I do? So just stop attacking me as if I just pulled the numbers out of a hat.

Nobody is attacking you; however, you are doing precisely that. There's no way to validate your claims because they don't line up with the experiences reported by both users and reviewers, not to mention the theoreticals don't add up. If an RTX 5090, which shows three times the performance of your GPU, still falls far short of that claim, the evidence is overwhelmingly against you.

This reminds me of the Starfield flame up or even Avatar. While people were complaining online about no DLSS AMD users were busy enjoying the Game and wondering what all the fuss was about.

A mod implementing DLSS 3 with frame generation was released for Starfield before Bethesda shipped the patch adding a native implementation, and it worked well. It was largely niche amongst the player base, though; whoever could run DLSS frame generation in Ada's earlier days would just download that mod and that was it. Noteworthy that this predates the release of FSR 3's frame generation feature.

...just don't underestimate the audience, it's a critical mistake.
 
I didn't mention it. But I bought a FreeSync monitor 10 years ago and bought an AMD graphics card a year later specifically because it supported FreeSync and I believed at the time Nvidia and Intel would one day support FreeSync as well. (And my next two primary GPUs were AMD for other reasons.)

But my cheap and early adopter FreeSync monitor only offers VRR from 45Hz to 65Hz. It's rarely helpful. AMD introduced FreeSync Premium more recently to address this but should've done that from the start, as Nvidia did with strict G-Sync rules. (I'm okay with FreeSync being offered in poor implementations like mine but clear marketing for good implementations is essential.)

The reason I didn't mention FreeSync is because it was a response to the idea of VRR that Nvidia came up with, because my vendor-agnostic buying priorities seem to be shared by no one else, because G-Sync generally remained the best option for people who prioritized VRR, and because FreeSync didn't seem very well marketed. For example, I wanted that wind turbine demo AMD made, but it was only shown to the press, as if the rest of us weren't allowed to see clear demos of how helpful VRR is.
The ASUS MG278Q monitor has a variable refresh rate (FreeSync) range of 35Hz to 144Hz and was announced in 2015... not sure when I purchased it, but pretty sure I purchased an R9 390 around that time and a 5700 XT as a later upgrade. Also, I was able to download that windmill demo back then, not sure why you couldn't find it, unless I looked after you did.
 
Well, AMD and Intel have only themselves to blame.

Intel should have secured more production and fixed its driver overhead issue. If driver overhead weren't a problem and GPUs were readily available at MSRP, it would have done better.

AMD screwed up badly. Their GPUs are priced way too high. I agree with Gamers Nexus, who said they should be 25% cheaper because AMD is behind Nvidia on the features front; AMD is playing catch-up. I can get a 5070 Ti for not much more than a 9070 XT. A 5060 Ti 16 GB is also roughly the same price as a 9060 XT. Supposedly there are cheap 9060 XTs and 9070 XTs, but they're never in stock.

Anecdote.

My personal experience is that I could not find a 5070 or 5070 Ti at MSRP or anywhere near it. It competes at the level of the 9070 XT... and I was lucky enough to snag one of those with a decent cooling system and a higher power target for well under the cheapest Nvidia offering (about $120 less). What I've found is that the 9070 XT and the 3080 are pretty similar in everyday performance (running a dual-screen QHD setup). Where the two differ is power draw (by just a bit), noise (my 3080 was bare-bones... so maybe not as fair to the model so much as to my particular SKU), and whether or not I have to deal with frame delivery burps... which, anecdotally, are less frequent on the 9070 XT. I... am pretty happy with the 9070 XT overall, but I know that is anecdote.

At this point, I want AMD to pull out of the gaming market just to see what else Dear Leader Jensen will pull and how much more abuse the Ngreedia fanbois will take before finally saying it's enough.

One word. Nintendo.

First they didn't have to compete against Sega (consoles). Then, they didn't have to compete against Sony (handhelds). Now they aren't engaging with the market, believe themselves a premium experience, and have priced themselves as such. Nvidia, sans the competition of AMD, will inevitably launch its own Virtual Boy, and there will be a legion of people who defend it. I... don't want that again. I am also happy if we decide 4K is just not possible without better optimization, and the PS5/Xbox Series generations are lost to history forever, in return for the next generation of consoles focusing on putting out a ton of games instead of the five tentpole releases a year that keep them on life support. Even if that comes at the expense of a generation of graphics regressing, so that gameplay is more important than visuals again.
 
The ASUS MG278Q monitor has a variable refresh rate (FreeSync) range of 35Hz to 144Hz and was announced in 2015... not sure when I purchased it, but pretty sure I purchased an R9 390 around that time and a 5700 XT as a later upgrade. Also, I was able to download that windmill demo back then, not sure why you couldn't find it, unless I looked after you did.
I'm glad there were good options. I should've made good VRR a bigger priority back then. I did some searching the other day and while the windmill demo is hard to find today, it sounds like it was available in the past. And in my searching I found AMD's more recent Oasis HDR FreeSync 2 demo which is pretty good.
 
No matter whom they employ, they are still committed to very poor management. Misused geniuses exist, too.

In December, I was a wee bit baffled by the fact there's no AMD GPU release. The perfect moment: NV have nada to offer, DLSS4 doesn't exist, peeps are in full shopaholic mode because it's Christmas and stuff, yet AMD keep it vocal, not physical. In January, I started being really afraid it's gonna be the same shit again. And by February, it's become confirmed: AMD didn't even try to be decent. By March, I realised I was wrong: it's not the same shit again, it's worse shit again. And now... Well. Buying an AMD GPU has never been more fanboyish than it is now.

From here on, the AMD GPU division's actions effectively don't matter. If they try fixing it, no one will care. If they keep it up, no one will care. If they sell out, no one is genius and enthusiastic enough to make it work. If they vanish, no one will care either. Buying an AMD GPU now is like dancing on a minefield: they can stop making new drivers at any given moment. They bulldozed themselves much harder with RDNA4 than they did with the FX processors.
I don't disagree with what you're saying here. AMD is a smaller company than Nvidia. They can't afford to compete with Nvidia's R&D and wafer allocation. I don't know that they can produce more GPUs or sell them much cheaper than they already are, as price cuts don't usually come this early.
I miss ATI, and my trusty X1900 GPUs. The only reason I went Nvidia back when I did was CUDA for my pro apps, as gaming was secondary.
 
I don't disagree with what you're saying here. AMD is a smaller company than Nvidia. They can't afford to compete with Nvidia's R&D and wafer allocation. I don't know that they can produce more GPUs or sell them much cheaper than they already are, as price cuts don't usually come this early.
I miss ATI, and my trusty X1900 GPUs. The only reason I went Nvidia back when I did was CUDA for my pro apps, as gaming was secondary.

It's been 20 years. It's time to realize the "ATI" that made the X1K GPUs is a footnote in history.
 
It's been 20 years. It's time to realize the "ATI" that made the X1K GPUs is a footnote in history.

I must admit, AMD are on the right track with FSR4 and their 9-series RT performance improvements, but the problem with this statement is that Nvidia is leading the way in technology and AMD has been playing catch-up.

They will need to bring some form of their own technology (does RTG still exist?) to the table to gain some kind of leading edge over Nvidia, which should pave the way for at least some sort of transformation into bigger sales numbers.
 
It's been 20 years. It's time to realize the "ATI" that made the X1K GPUs is a footnote in history.
True. But the only thing that has really changed is the pricing.

I must admit, AMD are on the right track with FSR4 and their 9-series RT performance improvements, but the problem with this statement is that Nvidia is leading the way in technology and AMD has been playing catch-up.

They will need to bring some form of their own technology (does RTG still exist?) to the table to gain some kind of leading edge over Nvidia, which should pave the way for at least some sort of transformation into bigger sales numbers.
They have attempted this in the past by jumping on new hardware early, like GDDR4, HBM, and water cooling. How can they compete with a company that has endless funds for R&D? The only way I see that happening is Nvidia abandoning gamers for AI even more than they already have. Gamers are still eating up their GPUs at insane prices and can't get enough, even though they have lackluster performance per dollar.
 
They have attempted this in the past by jumping on new hardware early, like GDDR4, HBM, and water cooling.

I suppose this goes all the way back to Pixel Shader 3.0, and possibly beyond that, although I'm not familiar with hardware before my era.

Nvidia has led the way with CUDA, G-Sync, RTX, DLSS, PT and MFG, to name a few. AMD just seems to be following Nvidia's lead.

How can they compete with a company that has endless funds for R&D?

Well, they need to come up with something. I understand it's not going to be easy, but they need to break this pattern somehow to get dGPU sales back in stride. Their 9xxx series has been a hit, I believe, but they will need to be more innovative moving forward. Something to break Nvidia's trend.

Does AMD really need to accomplish this in order to be more competitive in the dGPU landscape? I'd say so, because all they seem to be doing is following in Nvidia's tracks atm.
 
The only way I see that happening is Nvidia abandoning gamers for AI even more than they already have.
Where does this rhetoric originate from?

DLSS transformer model, MFG, new primitives, neural rendering. Those are all gaming features.

If anyone has abandoned gamers it’s been AMD. AI upscaling five years after Nvidia for example. New hardware required.

It’s been a decade or longer since AMD did any innovating for gamers.
 
They can't afford to compete with Nvidia's R&D and wafer allocation
Even if this is true, it doesn't excuse their scam after scam after scam. They renamed their GPUs to look more like Nvidia's. They released these GPUs horribly late. They released these GPUs with fake MSRPs and abysmal availability. They placed them right at the same price brackets. Not to mention how poor their software advancements are.
And I dare to remind you they hyped RDNA4 up as if they were about to flood the market with insane-value products. Where are those? The 9070 XT is the same crap as the 5070 Ti at best; oftentimes it's just as expensive, which makes it even worse, because it's less feature-rich and also slower on average. The 9060 XT 16 GB is a barely okay product, and the 8 GB version is the same bull as the 5060 8 GB but with a different paint job.

I hate everything about it. They couldn't have released anything more cursed than that.
 
Where does this rhetoric originate from?

DLSS transformer model, MFG, new primitives, neural rendering. Those are all gaming features.

If anyone has abandoned gamers it’s been AMD. AI upscaling five years after Nvidia for example. New hardware required.

It’s been a decade or longer since AMD did any innovating for gamers.
I was talking about the last couple of years... as in, all of their GPUs the past two gens, especially the 50 series, have been lackluster as far as hardware is concerned (minus the 90-class cards that the majority cannot afford). I guess I am showing my age. My first AMD (ATI) card was a Radeon 7000, and my first Nvidia card a GeForce 2. I was gaming on high settings at 1080p since before AMD even purchased their GPU division. And guess what, I still am! Albeit at 250 Hz.

Believe it or not, AMD and Nvidia have been closely matched for much of history. It was when their CPUs needed a redesign to match Intel that GPU R&D was halved, and I don't think they ever recovered from that kick in the pants. BUT, AMD is clawing back Adobe and other pro apps, which for me is key. And then there is the performance in COD-HC MP, which is the only game I play anymore. AMD wins there too. I don't use upscaling, ray tracing, etc. It kills my latency and FPS.

Nvidia has been forced to become a gaming software company for several reasons, one being that hardware just can't keep up with development demand. That's not necessarily a bad thing, but they have abused their position and the consumers with their pricing scheme. You may be too young to remember SLI, a marketing scheme to get consumers to purchase two cards... or ray tracing! Man, that garbage took off fast, didn't it? Again, it failed because their hardware could barely run their new "feature" for, I don't know, four generations of their GPUs. Love me some Doom II RTX, however.

My next card, like the last two I purchased, will be Intel (for my workstation). And then we'll see where the market is after all this AI business is over, and maybe I will replace my 4070 Super.
 
True. But the only thing that has really changed is the pricing.


They have attempted this in the past by jumping on new hardware early, like GDDR4, HBM, and water cooling. How can they compete with a company that has endless funds for R&D? The only way I see that happening is Nvidia abandoning gamers for AI even more than they already have. Gamers are still eating up their GPUs at insane prices and can't get enough, even though they have lackluster performance per dollar.
Doesn't really look like they want to compete, to me. Seems like they're using their production capacity to make more CPUs instead.
Even if this is true, it doesn't excuse their scam after scam after scam. They renamed their GPUs to look more like Nvidia's. They released these GPUs horribly late. They released these GPUs with fake MSRPs and abysmal availability. They placed them right at the same price brackets. Not to mention how poor their software advancements are.
And I dare to remind you they hyped RDNA4 up as if they were about to flood the market with insane-value products. Where are those? The 9070 XT is the same crap as the 5070 Ti at best; oftentimes it's just as expensive, which makes it even worse, because it's less feature-rich and also slower on average. The 9060 XT 16 GB is a barely okay product, and the 8 GB version is the same bull as the 5060 8 GB but with a different paint job.

I hate everything about it. They couldn't have released anything more cursed than that.
Agreed, it's just crap and more crap. Every GPU they've released should have had a $50 cheaper MSRP, but it wouldn't have mattered anyway, since they'd just be $300 above MSRP compared to NVIDIA's $100.
 
Nvidia has been forced to become a gaming software company for several reasons, one being that hardware just can't keep up with development demand.
Hardware has never been able to keep up with software. Remember “But can it run Crysis?”?

they have abused their position and the consumers with their pricing scheme
So this is your real issue. Yet, AMD is doing the same.

You may be too young to remember SLI, a marketing scheme to get consumers to purchase two cards
If your first video card was an HD 7000, I can promise you I have tons more experience than you.

Funny you don’t think Crossfire was “a marketing scheme to get consumers to purchase two cards”. (Thanks for the additional example of AMD being late to the game - it took them seven years to come up with a SLI competitor. It also required a specialized motherboard and a very expensive master graphics card.)

Oh, and btw SLI was invented by 3Dfx for the Voodoo 2 card in 1998. Nothing to do with Nvidia besides Nvidia buying the technology from 3Dfx when they went bankrupt.
 