
NVIDIA Grabs Market Share, AMD Loses Ground, and Intel Disappears in Latest dGPU Update

My reasoning is: will spending the extra 70€ for 15% more performance (which is incorrect in real-world terms, based on my own card, which is faster than a 5070 Ti) benefit the buyer?
Yes, if he uses RT and PT then the extra performance will benefit the buyer. I mean you did the exact same thing yourself by buying the 9070xt over the normal 9070.

Saying it's incorrect in real world performance is just not correct. Wanna try a game with RT? I have a bunch of 5070tis - we can compare with your 9070xt if you wish.
 
Yes, if he uses RT and PT then the extra performance will benefit the buyer. I mean you did the exact same thing yourself by buying the 9070xt over the normal 9070.

Saying it's incorrect in real world performance is just not correct. Wanna try a game with RT? I have a bunch of 5070tis - we can compare with your 9070xt if you wish.
But will it really? I mean, if you're playing an RT/PT game then you're also probably using DLSS or FSR, so no, your extra RT performance is not required, as daddy Jensen says you've got a 4090 with DLSS 4 /s. You can of course try to run these games without frame gen, but on that class of card you're not in for a great time for the most part. This is the paradox of modern GPUs: buy our super duper fastest, best hardware, then use software AI-generated frames to make your new hardware sing. It's all a magic show at this point.
 
Your sarcasm aside, projected service life was my only disappointment with my 6800 XT. I fully expected it to last until next gen. The failure was completely unexpected, and the last time I had one like it was a GTX 480. I can't complain; it just made the value proposition worse. The lost service time and lack of resale value will definitely affect my decision-making next generation.
That legitimately sucks and I'd expect any hardware of the same vintage to be kicking strong today. I'm sorry that happened.

Curiously, what's your plan for replacing it?
 
But will it really? I mean, if you're playing an RT/PT game then you're also probably using DLSS or FSR, so no, your extra RT performance is not required, as daddy Jensen says you've got a 4090 with DLSS 4 /s. You can of course try to run these games without frame gen, but on that class of card you're not in for a great time for the most part. This is the paradox of modern GPUs: buy our super duper fastest, best hardware, then use software AI-generated frames to make your new hardware sing. It's all a magic show at this point.
I do not get the argument "since you are using DLSS / FSR extra performance isn't required". But anyways, i was mostly addressing the part where you didn't believe that the 70ti is in fact a bunch faster in RT.
 
I'm running an 9070xt Red Devil until next gen.
Solid choice! Especially if you got it at a great price. Hopefully it lasts even longer than the last card too, which dying just seems like bad luck.
 
So I looked for the sources, and they, "unexpectedly", reached for the Steam hardware survey.
Do you guys know which AMD card is the most used?
Radeon(TM) graphics
Radeon graphics

Steam apparently reports all users with AMD iGPUs and dGPUs as only those two GPUs above.
Combined, they account for 4.5% of users.
Kinda weird, isn't it? There is no way 4.5% of Steam users are gaming on an iGPU.

So this report is rigged and has no real value.
AMD could have as much as 12-16% share.
They have sold a damn lot of 9070 (XT)s.

And Steam isn't the only platform in use.
 
Solid choice! Especially if you got it at a great price. Hopefully it lasts even longer than the last card too, which dying just seems like bad luck.
It's great, but I'll only keep it until next generation. As good as it is, I honestly don't think it's got the legs. I want a 5-7 year card, as I always run water cooling, and I'm not putting a block on a 70-class card. I was all set for a 5090 this generation, but the lack of power load balancing has knocked that on the head.

I know there are solutions coming in the form of a power supply, and der8auer is working on his own, but by the time these products are released I may as well wait and see what's next.
 
Solid choice! Especially if you got it at a great price. Hopefully it lasts even longer than the last card too, which dying just seems like bad luck.
QA/QC went right in the garbage during the pandemic. I've seen more posts on message boards and Reddit about higher-end cards from both IHVs' AIBs produced during that time period going belly up than at any other time I can remember.
 
This data just goes to show how irrelevant the whole commento-sphere is. That includes not just forum chickens, whose posts are read by a handful of people, but also the "big" playas, such as some famous YouTubers who spend most of their time declaring how horrible Nvidia is (GN, HU, etc.). Well, there's no business like clickbait business, but all their huffin' and puffin' doesn't seem to matter a jot.

Of course, people who subscribe to this tunnel-visioned narrative will rationalize these numbers with a cult/fanboyism/mindshare (? lol) handwave. That's OK, but those who really care about the PC market should perhaps try a little harder to analyze things than just screaming "Ngreeeeedia" over and over again.

While brand loyalty, OEM vendors and other such factors surely do matter, there's simply no denying that Nvidia consistently delivers the best options in the GPU space. This doesn't of course mean that all the cards are always brilliant, but each gen there's always something at least decent to choose, accompanied by an innovative feature set. And even if you still want to claim that "RT doesn't matter", you simply cannot say that about upscaling, Reflex and the other stuff that AMD just plays catch-up on (and has since G-Sync). These things matter to a lot of people, and this is why somebody like me, who was dyed in RED back in the ATI Rage glory days, now simply chooses to pay a bit more to be on the cutting-edge side of things.

On the flipside, how is it that the alleged white knight that is AMD is floppin again? The 9070 XT at 600 USD is undeniably a very strong proposition, which is why I got one at launch (I was just unlucky to get a dud), but where can I get one at this price (because only this makes sense), pray tell? How is it that the bigwigs such as Holy Steve, der8auer et al. and their followers will (rightly) castigate Nvidia for the shortage, but not their chosen team? How is it that Nvidia is "holding back gaming" with 8GB cards, but not AMD?

Rhetorical questions, of course :)
 
Elon Musk bought 200,000 5090s for his AI. That is a ton of cards. The D series was created so official sales can go to China, and Nvidia has no problem filling the rest of Southeast Asia with cards for that purpose. It is so insane that PC World did a "live" event at the newest Microcenter in Santa Clara. They made sure everyone they got on camera was looking for the same thing: a 5090 Founders Edition.

It is too bad, though, because this propaganda has made talking about PC gaming desultory. It has gotten to the point where people actually believe that the 4090 and 5090 are the only cards that can do 4K. Of course that is total hogwash; my monitor is 4K, and I can post my average FPS and they will still call me a liar.
 
So I looked for the sources, and they, "unexpectedly", reached for the Steam hardware survey.
Do you guys know which AMD card is the most used?
Radeon(TM) graphics
Radeon graphics

Steam apparently reports all users with AMD iGPUs and dGPUs as only those two GPUs above.
Combined, they account for 4.5% of users.
Kinda weird, isn't it? There is no way 4.5% of Steam users are gaming on an iGPU.

So this report is rigged and has no real value.
AMD could have as much as 12-16% share.
They have sold a damn lot of 9070 (XT)s.

And Steam isn't the only platform in use.
Someone needs to learn the difference between sales and installed base.
 
Someone needs to learn the difference between sales and installed base.
Someone also needs to understand sample size and statistics. Let's say the bug/error in the Steam HS reporting iGPUs for AMD CPUs is relatively common. Unlikely, since Zen chips only got one from Zen 4 onwards and, furthermore, a lot of people know to disable the iGPU in UEFI so no weirdness happens, but let's go with that. Extrapolating from this that it hurts AMD's representation is wild. Do these people just assume that every affected user runs a Radeon dGPU? Why? And if not, wouldn't it affect NV equally if not MORE, since we know they have a larger install base by default? Like, I don't understand the logic here. The HS, even accounting for this bug, is representative enough to be valid data, since the sheer pool of users is so enormous.
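To make the "sheer pool of users" point concrete, here's a minimal sketch of the standard margin-of-error calculation for a surveyed proportion. The respondent counts are purely illustrative assumptions, not Valve's actual sample sizes:

```python
import math

def margin_of_error(share: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion measured
    from a simple random sample of n respondents."""
    return z * math.sqrt(share * (1 - share) / n)

# Hypothetical: a 4.5% share measured across 1,000,000 respondents,
# versus the same share measured from only 10,000 respondents.
moe_large = margin_of_error(0.045, 1_000_000)
moe_small = margin_of_error(0.045, 10_000)
print(f"n=1,000,000: +/- {moe_large * 100:.3f} percentage points")
print(f"n=10,000:    +/- {moe_small * 100:.3f} percentage points")
```

With a sample in the millions, the sampling noise on a share figure shrinks to hundredths of a percentage point. The caveat is that margin of error only covers random sampling noise; it says nothing about systematic bias from the survey's opt-in nature or the iGPU-labeling bug discussed above.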
 
Notice how most of Nvidia's features are software features, while AMD's are mostly hardware features. Yes, there are some exceptions here and there.
What did AMD introduce?

EyeFinity for multi-monitor setups in 2009. Nvidia later followed with Surround in 2010, but it was not as good.
Mantle in 2013. It eventually morphed into Vulkan and lit a fire under MS's ass to release DX12.
HBM in 2015. Yes, it did not work out, but at least they tried something new with consumer GPUs. Nvidia only used HBM for workstation cards.
Radeon Chill in 2017, to dynamically adjust framerate based on user input. Nvidia later introduced something similar, but only for laptops.
SAM (ReBAR) for faster VRAM access. Nvidia again followed later with the 30 series, but on a whitelist basis for games.
DisplayPort 2.1 support in 2022. Arguably they dropped the ball this year by not moving to UHBR20 like Nvidia did.
Ray tracing and G-Sync are at least partially hardware. Mantle is software. AMD was the first with HBM and ReBAR and PCIe 4.0, but none of these things made AMD's GPUs the best at gaming. PCIe 4.0 was released before it became necessary. Most people with Vega graphics cards don't know they have HBM. ReBAR is partly a CPU tech too which I think gave AMD an advantage. Nvidia quickly adopted these things when it made sense.

But DLSS has a huge impact on gaming performance and is very visible to customers. Same with G-Sync and ray tracing to lesser extents.
 
Ray tracing and G-Sync are at least partially hardware. Mantle is software. AMD was the first with HBM and ReBAR and PCIe 4.0, but none of these things made AMD's GPUs the best at gaming. PCIe 4.0 was released before it became necessary. Most people with Vega graphics cards don't know they have HBM. ReBAR is partly a CPU tech too which I think gave AMD an advantage. Nvidia quickly adopted these things when it made sense.

But DLSS has a huge impact on gaming performance and is very visible to customers. Same with G-Sync and ray tracing to lesser extents.
I would argue that Freesync has been the greatest boon for AMD. I noticed you left that one out. Yes it is the same thing as VRR.
 
On the flipside, how is it that the alleged white knight that is AMD is floppin again? The 9070 XT at 600 USD is undeniably a very strong proposition
If it kept that price it would instantly kill three Nvidia GPUs: the 5070, the 5070 Ti and probably the 5080; nobody would be crazy enough to spend roughly twice the money for that. But sadly that's not the world we live in, and in reality the 9070 XT is priced at Nvidia minus $69.99, so yeah, most people would prefer grabbing a 5070 Ti instead. Especially with how many driver issues the 9070 XT has, just spend the extra $70 and have your peace of mind.

Ray tracing and G-Sync are at least partially hardware. Mantle is software. AMD was the first with HBM and ReBAR and PCIe 4.0, but none of these things made AMD's GPUs the best at gaming. PCIe 4.0 was released before it became necessary. Most people with Vega graphics cards don't know they have HBM. ReBAR is partly a CPU tech too which I think gave AMD an advantage. Nvidia quickly adopted these things when it made sense.

But DLSS has a huge impact on gaming performance and is very visible to customers. Same with G-Sync and ray tracing to lesser extents.
The G-Sync sticker made monitors $200 more expensive; it was an easy skip on my part.
 
Elon Musk bought 200,000 5090s for his AI. That is a ton of cards. The D series was created so official sales can go to China, and Nvidia has no problem filling the rest of Southeast Asia with cards for that purpose. It is so insane that PC World did a "live" event at the newest Microcenter in Santa Clara. They made sure everyone they got on camera was looking for the same thing: a 5090 Founders Edition.

It is too bad, though, because this propaganda has made talking about PC gaming desultory. It has gotten to the point where people actually believe that the 4090 and 5090 are the only cards that can do 4K. Of course that is total hogwash; my monitor is 4K, and I can post my average FPS and they will still call me a liar.
The D series is a show to comply with regulations; Jensen is complicit in 5090s being smuggled to China, and factory-sealed AIB cards are mysteriously being swapped out with bags. And the propagandizing of the 5090 is just weird, as it was well known that the 9070 and 9070 XT were outselling the entire RTX 5000 series at Microcenter. All this nonsense about market share, while brand loyalty and mindshare are very strong at keeping customers within the ecosystem of proprietary features.
Well, there's no business like clickbait business, but all their huffin' and puffin' doesn't seem to matter a jot.
I'm not going to get into the accusations of clickbait, but any criticism doesn't matter when a company has a near monopoly.
where can I get one at this price (because only this makes sense), pray tell?
It's supply and demand, but no one ever makes that excuse for AMD of course. Even if AMD could double the supply they wouldn't be able to keep up with market demand, and retailers couldn't keep them in stock at MSRP.
That's ok, but those who really care about the PC market
Those who really care about the PC market should be worried about the near monopoly Nvidia has, it isn't just "better" software or features.
How is it that the bigwigs such as Holy Steve, der8auer et al. and their followers will (rightly) castigate Nvidia for the shortage, but not their chosen team? How is it that Nvidia is "holding back gaming" with 8GB cards, but not AMD?
Because Nvidia is the market leader, people should have higher expectations of the one controlling the market.
 
Because Nvidia is the market leader, people should have higher expectations of the one controlling the market.
That's completely backwards. You are basically saying that I should expect Nvidia, with their 92% market share, to try as hard as possible to grab the remaining 8%, while AMD tries their hardest to get to 0% market share. What kind of logic is that? The sad part is AMD is listening to you and seems to be in a rush to hit that 0%, lol.
 
Someone also needs to understand sample size and statistics. Let's say the bug/error in the Steam HS reporting iGPUs for AMD CPUs is relatively common. Unlikely, since Zen chips only got one from Zen 4 onwards and, furthermore, a lot of people know to disable the iGPU in UEFI so no weirdness happens, but let's go with that. Extrapolating from this that it hurts AMD's representation is wild. Do these people just assume that every affected user runs a Radeon dGPU? Why? And if not, wouldn't it affect NV equally if not MORE, since we know they have a larger install base by default? Like, I don't understand the logic here. The HS, even accounting for this bug, is representative enough to be valid data, since the sheer pool of users is so enormous.

Even rudimentary understanding of sampling and statistics (full disclosure: that's where I'm at, and that's being generous) is pretty rare. Combine that with misunderstanding about suitability, applicability, and/or context of data, plus the human tendency to see what we want to see... Well, I'm not surprised at all how much hate the SHWS gets.

Anyway, for anyone interested, the Steam Hardware Survey is useful for mostly one thing: trend of installed base over time. Due to randomized sampling and its opt-in nature, it is absolutely not an authoritative source. Single-month snapshots are just this side of meaningless, as month-to-month fluctuations are very common due to statistical variance. It's also, in my previously-admitted limited understanding, a lagging indicator. Purchase trends may not show up in any meaningful way until several months after the fact, or longer.

Looked at another way, it's kind of like doing forestry from satellite images. You pick a coordinate at random each month and analyze what's visible in the canopy in, say, a 5 km radius. You get a broad idea of the general health of the forest (within certain assumptions), but not a detailed breakdown of species in other areas, nor any information on what's happening on the ground. Even the information you get in that single image isn't worth much until you compare it against the data collected in previous analyses. Clearly, this is not enough to make any real decisions if you're the entity in charge of maintaining the forest, and you'll have other sources of data if you are. But if you're an interested amateur with only that satellite data available to you? Well, you make do with what you've got.

When it comes to the SHWS, we are that amateur, and must take care not to assign significance to the data that isn't warranted, or to dismiss it outright because it doesn't show us what we expect or think we should see.
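The "trend over time, not single-month snapshot" point can be sketched with a trailing moving average over hypothetical monthly share figures. The numbers below are made up to illustrate the smoothing, not real SHWS data:

```python
def moving_average(series, window=3):
    """Trailing moving average; the first few entries just average
    however many points are available so the output length matches."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical monthly share (%) with survey noise around a slow climb
monthly = [8.1, 7.6, 8.4, 7.9, 8.6, 8.2, 8.9]
smoothed = moving_average(monthly)
print([round(x, 2) for x in smoothed])
```

Any single month can move against the trend purely from sampling variance; the smoothed series is what actually tells you whether an installed base is growing or shrinking.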

The D series is a show to comply with regulations; Jensen is complicit in 5090s being smuggled to China, and factory-sealed AIB cards are mysteriously being swapped out with bags. And the propagandizing of the 5090 is just weird, as it was well known that the 9070 and 9070 XT were outselling the entire RTX 5000 series at Microcenter. All this nonsense about market share, while brand loyalty and mindshare are very strong at keeping customers within the ecosystem of proprietary features.

To start, JHH leads Nvidia, he isn't Nvidia. But let's say he has full control over all NV activities at every level. I find it highly unlikely that if the prospect of that activity even made it to his desk that he'd sign off on it. What reason would he have? Nvidia gets paid the same no matter where the cards end up, and the potential PR black eye just isn't worth it. NV's not shipping the cards out anyway: The AIB is, and I'd put money down that the bag swap starts and ends with them.
 
I would argue that Freesync has been the greatest boon for AMD. I noticed you left that one out. Yes it is the same thing as VRR.
I didn't mention it. But I bought a FreeSync monitor 10 years ago and bought an AMD graphics card a year later specifically because it supported FreeSync and I believed at the time Nvidia and Intel would one day support FreeSync as well. (And my next two primary GPUs were AMD for other reasons.)

But my cheap and early adopter FreeSync monitor only offers VRR from 45Hz to 65Hz. It's rarely helpful. AMD introduced FreeSync Premium more recently to address this but should've done that from the start, as Nvidia did with strict G-Sync rules. (I'm okay with FreeSync being offered in poor implementations like mine but clear marketing for good implementations is essential.)

The reason I didn't mention FreeSync is because it was a response to the idea of VRR that Nvidia came up with, because my vendor-agnostic buying priorities seem to be shared by no one else, because G-Sync generally remained the best option for people who prioritized VRR, and because FreeSync didn't seem very well marketed. Like, I wanted that wind turbine demo AMD made, but it was only shown to the press, as if the rest of us weren't allowed to see clear demos of how helpful VRR is.
 
I said it was the highest-selling AMD card, and you said that's false because it was the most popular card in the series. Sorry, what? So it was the highest-selling card, just like I claimed. So clearly, a big % buys higher-end GPUs.
So when I proved that Nvidia's 70-class and up do not account for 50% of the market, you pivoted to a single AMD card in a single series to prove your point?
The 7900 XTX is an outlier. In the 6000 series the most popular card was the 6600, followed by the 6700 and 6800. The 6900 was below all of them.
I'm not in disagreement; if you don't think the 70 Ti is worth 70€ extra, more power to you and all that. I'm just saying it's not as bad a deal as people want you to think it is. It's 15% faster in RT and god knows how much faster in PT, so it costing 9% more is completely reasonable. Now, you shouldn't pay 70€ extra if you are not going to be using RT/PT, obviously, but that doesn't make the 70 Ti overpriced. It has to be more expensive than the 9070 XT since it's the better card.
I'm sure many people are buying the 5070 Ti for running PT in the handful of games where it's available. /s

I went with the 9070 XT instead of the 5070 Ti for the following reasons: standard power cable, faster boot time with the AMD card, much better control panel, and lower price (roughly 150€ at the time of purchase).

And going by Nvidia's past decisions, I figured they would screw me over with some new feature when the 60 series launches, like they have screwed over every RTX owner since the 20 series. That one did not get ReBAR or FG despite being fully capable of both. I was a 2080 Ti owner myself, so that one stung. The 30 series also did not get FG. The 40 series was denied MFG. God knows what the 50 series will be denied when the 60 series launches.

If Nvidia at least had a solid reason then I could understand it, but they didn't. Modders proved it by getting ReBAR working on the 20 series. AMD proved it by giving 20 and 30 series owners FG, and Lossless Scaling proved it with MFG on older cards. Meanwhile I have not yet seen anyone getting FSR 4 running on prior-generation cards. There's also the issue of chip cutting. With Nvidia, a user can buy a 5090 for 2000+ and still not get the fully enabled chip, and Nvidia could screw them over within the same series by releasing a more enabled one.
With AMD that is not the case; in every RDNA generation the top chip has been a fully enabled die.

I have owned exactly equal numbers of AMD and Nvidia cards over the years, each time selecting the one that made sense for my needs and each time using it for 4-5 years. I'm not buying a 5070 Ti for PT, because by the time PT becomes widespread and easy enough to run there will be cheaper and faster cards available for it. For the same reason I did not buy the 2080 Ti for RT: I knew that by the time RT became widespread my card would not be able to handle it anyway. I bought it for the VRAM instead of the newer 3070 at the time, and I was right. Besides, I came from a GTX 1080, so I already had 8GB of VRAM. It made zero sense to move to another 8GB card.
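The "15% faster for 9% more money" claim a few posts up boils down to a simple perf-per-euro comparison. A minimal sketch, where the 700€ baseline price and 100-point performance index are assumptions for illustration, not benchmark data:

```python
def perf_per_euro(perf_index: float, price: float) -> float:
    """Performance points bought per euro spent."""
    return perf_index / price

# Assumed baseline: 700 euro card with a performance index of 100.
# Alternative from the claim: +9% price for +15% RT performance.
base = perf_per_euro(100.0, 700.0)
alt = perf_per_euro(115.0, 700.0 * 1.09)
print(f"baseline: {base:.4f} perf/euro, alternative: {alt:.4f} perf/euro")
```

Under those assumed numbers, the pricier card actually delivers slightly more RT performance per euro; the calculus flips entirely if RT is never enabled, which is the crux of the disagreement in this thread.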
Ray tracing and G-Sync are at least partially hardware. Mantle is software. AMD was the first with HBM and ReBAR and PCIe 4.0, but none of these things made AMD's GPUs the best at gaming. PCIe 4.0 was released before it became necessary. Most people with Vega graphics cards don't know they have HBM. ReBAR is partly a CPU tech too which I think gave AMD an advantage. Nvidia quickly adopted these things when it made sense.

But DLSS has a huge impact on gaming performance and is very visible to customers. Same with G-Sync and ray tracing to lesser extents.
I never said the features AMD came up with first made their GPUs the best at gaming. I simply pointed that list out to people who constantly claim that Nvidia was always first with everything and AMD never introduced anything first.
I would argue that Freesync has been the greatest boon for AMD. I noticed you left that one out. Yes it is the same thing as VRR.
Technically FreeSync came after G-Sync, but it did democratize zero-cost VRR for the masses. Before it, you had to pay a $200 tax. I know because I had a monitor with a hardware G-Sync module that I used for 9 years.
Especially with how many driver issues the 9070 XT has, just spend the extra $70 and have your peace of mind.
What many driver issues? The only driver issue I've had with the 9070 XT is that the latest 25.6.1 requires HWiNFO64 and possibly GPU-Z updates to display all sensor values properly. Looking at the TPU forums, I don't see a massive 12+ page thread about AMD driver issues either.

I suppose you could say I have a sample size of one, but I haven't seen anything major reported with RDNA 4 and its drivers.
Meanwhile you suggest the 50 series for peace of mind. The same 50 series whose drivers are so bad that a major tech reviewer had to do a dedicated video on them. Even Digital Foundry mentioned it, and I've heard from laptop reviewers that most 50 series laptops are extremely buggy due to drivers. Especially the 5060.
The gsync sticker made monitors 200$ more expensive, it was an easy skip on my part.
Well, to be honest, it did at least introduce something to the market that was missing before.
Arguably, sticking an expensive FPGA that in several cases required active cooling into a monitor was not the greatest idea.

Not to mention the vendor lock-in. First-gen G-Sync modules did not have user-updateable firmware, and if a user paired an AMD card with a monitor containing such a module, they completely lost VRR.
 
So when I proved that Nvidia's 70-class and up do not account for 50% of the market, you pivoted to a single AMD card in a single series to prove your point?
The 7900 XTX is an outlier. In the 6000 series the most popular card was the 6600, followed by the 6700 and 6800. The 6900 was below all of them.
So by not offering a "7900 XTX" back in 2018, they didn't even have the chance to make it a best seller. What are we arguing here?

I'm sure many people are buying the 5070 Ti for running PT in the handful of games where it's available. /s
The majority of games that you buy a new GPU for have RT or PT, man. I don't get the argument you are making. In general, you don't buy a new GPU so you can play every game; most games already worked on your old card. You are buying a new GPU to play those handful of heavy titles that don't work on your old card.

I went with the 9070 XT instead of the 5070 Ti for the following reasons: standard power cable, faster boot time with the AMD card, much better control panel, and lower price (roughly 150€ at the time of purchase).
Good for you, but all of those are personal preferences. Someone else might prefer Nvidia's control panel and the new 12VHPWR cable. The difference in RT performance isn't a preference, it's a reality.

And going by Nvidia's past decisions, I figured they would screw me over with some new feature when the 60 series launches, like they have screwed over every RTX owner since the 20 series. That one did not get ReBAR or FG despite being fully capable of both. I was a 2080 Ti owner myself, so that one stung. The 30 series also did not get FG. The 40 series was denied MFG. God knows what the 50 series will be denied when the 60 series launches.
Come on, SURELY you must realize how ironic that sounds when AMD left all of their old cards with the horribly outdated FSR, even their $1K cards, while they are giving the new FSR 4 to cards that cost $300. Especially considering Nvidia gave DLSS 4 to 7-year-old cards, including the 20 series that you mentioned. If this isn't a biased-as-hell take, then what is?

Regarding the driver issues, there is a huge megathread about it on amd's reddit. You can go take a look.
 
You can just look at AMD's CPU products. They have had the better product for some years now, but they've only managed around 30% market share despite continued Intel fuckups.

Brand loyalty or marketing.

Huh? I've been getting the Steam survey invite for consecutive years here, going back to the 2000s. You must not get on Steam regularly then?

Your argument is not valid.

I played "Star Conflict" for several years with GNU userspace and the Linux kernel and never saw that Steam survey. I also played Star Conflict a bit on Windows and never saw the survey.

I'm one of the few older PC players who have never spent 5 €, 5 US dollars or 5 Canadian beaver bucks on Steam. Maybe that's why I wasn't able to take part in the past. Fanatical, or whatever that site is called, regularly gives away games, but I'm not allowed to have one because I never spent any cash there.

I saw the Steam survey for the first time a few weeks ago on my 4th refurbished notebook with Windows 11 Pro. I just download games on that box and never play them, but I got the Steam survey. Makes sense, right?
 
So by not offering a "7900xtx" back in 2018 they don't even have the chance to make it a best seller. What are we arguing here?
Now you jump back to 2018 again. Make up your mind. You're arguing that most people buy expensive GPUs based on one model, from one generation. When I proved your initial assertion wrong, you started with this XTX talk. No, most people don't buy expensive GPUs.
End of discussion.
The majority of games that you buy a new GPU for have RT or PT, man. I don't get the argument you are making. In general, you don't buy a new GPU so you can play every game; most games already worked on your old card. You are buying a new GPU to play those handful of heavy titles that don't work on your old card.
No, they don't. Very few games have PT. Yes, more games have RT, but even that is not universal.
You are also wrong about why I buy a new GPU. It is exactly to play every game, even the ones that worked on my old card.
The handful of newer and heavier games worked on my old card, just at 1440p 60. I wanted 1440p 120, and I got that with my new card.
I didn't buy this card for RT, much less PT (it would have been the same if I'd bought Nvidia), and not even for upscaling. I bought it because I wanted higher framerates in all games. I wanted a near-100% perf increase over my old card and more VRAM. And I wanted it cheaper than the 900€ Nvidia was asking at the time.
Good for you, but all of those are personal preferences. Someone else might prefer Nvidia's control panel and the new 12VHPWR cable. The difference in RT performance isn't a preference, it's a reality.
I don't know anyone who has used AMD and still prefers Nvidia's control panel after that. The difference in price is also not a preference. It's reality.
Come on, SURELY you must realize how ironic that sounds when AMD left all of their old cards with the horribly outdated FSR, even their $1K cards, while they are giving the new FSR 4 to cards that cost $300. Especially considering Nvidia gave DLSS 4 to 7-year-old cards, including the 20 series that you mentioned. If this isn't a biased-as-hell take, then what is?
Like I said, AMD properly justified it. RDNA 3 and older are unable to run the FP8 that FSR 4 uses at acceptable speed. It's a hardware issue, not an arbitrary software restriction. Not to mention that AMD has done this once, now; Nvidia has done it for the past 5 years. Am I supposed to pat Nvidia on the back for once doing the right thing and giving older cards DLSS 4? If a company keeps screwing over its buyers, it doesn't get a good mark from me when it finally, for once, does one thing that benefits its previous series' buyers. DLSS 4 does not make up for the lack of ReBAR, FG or MFG that the older cards are clearly able to run.
Regarding the driver issues, there is a huge megathread about it on amd's reddit. You can go take a look.
Please link it, as I'm unable to find it. The best I can see is a tech support megathread started 2 months ago that has 390 posts. Nvidia has similar megathreads.
This does not mean every single post in those threads is driver-related.
 
No, they don't. Very few games have PT. Yes, more games have RT, but even that is not universal.
You are also wrong about why I buy a new GPU. It is exactly to play every game, even the ones that worked on my old card.
The handful of newer and heavier games worked on my old card, just at 1440p 60. I wanted 1440p 120, and I got that with my new card.
I didn't buy this card for RT, much less PT (it would have been the same if I'd bought Nvidia), and not even for upscaling. I bought it because I wanted higher framerates in all games. I wanted a near-100% perf increase over my old card and more VRAM. And I wanted it cheaper than the 900€ Nvidia was asking at the time.
Why did you turn this into "you"? Nobody cares, my man. You don't represent the whole market. People buy new GPUs to play the heavy games, not the games their old GPU could already play. Plenty of those heavy AAA games have RT or PT.
I don't know anyone who has used AMD and still prefers Nvidia's control panel after that. The difference in price is also not a preference. It's reality.
Well, you just met one. I prefer Nvidia's old control panel over both Nvidia's new one and AMD's Adrenalin.

Like I said, AMD properly justified it. RDNA 3 and older are unable to run the FP8 that FSR 4 uses at acceptable speed. It's a hardware issue, not an arbitrary software restriction. Not to mention that AMD has done this once, now; Nvidia has done it for the past 5 years. Am I supposed to pat Nvidia on the back for once doing the right thing and giving older cards DLSS 4? If a company keeps screwing over its buyers, it doesn't get a good mark from me when it finally, for once, does one thing that benefits its previous series' buyers. DLSS 4 does not make up for the lack of ReBAR, FG or MFG that the older cards are clearly able to run.
Nvidia also properly justified it. You weren't convinced by Nvidia but were convinced by AMD because of inherent bias. You are not supposed to do anything; I'm just calling you out on your really ironic statement about being afraid you'll be cut off from some new Nvidia technology, when that's exactly what AMD just did. Not to mention their driver support, which is horrendous, cutting off support for stuff that's still ON the market, sold as brand new (the whole Vega iGPU lineup, lol!).

And like, what goddamn difference does it make whether or not it's justified? Say FSR 5 doesn't work on your 9070 XT for some justified reason; so freaking what? You still got the short end of the stick. They started developing a technology that couldn't work on their brand-new RDNA 3 cards that cost upwards of $1.1K, but it's all cool, no biggie, lol :D

Please link it, as I'm unable to find it. The best I can see is a tech support megathread started 2 months ago that has 390 posts. Nvidia has similar megathreads.
This does not mean every single post in those threads is driver-related.
I'll link it; it has over 1.2k comments, but it won't change your mind anyway.

 