I said it was the highest-selling AMD card, and you said that's false because it was the most popular card in the series. Sorry, what? That means it was the highest-selling card, just like I claimed. So clearly, a big percentage of people buy higher-end GPUs.
So when I proved that Nvidia's 70 class and up do not account for 50% of the market, you pivoted to a single AMD card in a single series to prove your point?
The 7900 XTX is an outlier. In the 6000 series the most popular card was the 6600, followed by the 6700 and 6800. The 6900 was below all of them.
I'm not in disagreement. If you don't think the 5070 Ti is worth 70€ extra, more power to you and all that; I'm just saying it's not as bad a deal as people want you to think it is. It's 15% faster in RT and god knows how much faster in PT, so it costing 9% more money is completely reasonable. Obviously you shouldn't pay the extra 70€ if you are not going to be using RT/PT, but that doesn't make the 5070 Ti overpriced. It has to be more expensive than the 9070 XT since it's the better card.
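To make that value math concrete, here's a minimal sketch, assuming hypothetical prices of 780€ for the 9070 XT and 850€ for the 5070 Ti (roughly the 70€ / ~9% gap discussed here) and the ~15% RT uplift mentioned above; actual prices vary by region and over time.

```python
# Minimal sketch of the price-vs-RT-performance argument above.
# Prices are hypothetical placeholders (~70 EUR / ~9% apart); the ~15% RT uplift
# is the figure cited in this thread, not a benchmark result.
price_9070xt = 780.0     # EUR, assumed
price_5070ti = 850.0     # EUR, assumed (~9% more)
rt_perf_9070xt = 1.00    # normalized RT performance
rt_perf_5070ti = 1.15    # ~15% faster in RT

price_ratio = price_5070ti / price_9070xt        # ~1.09
value_9070xt = rt_perf_9070xt / price_9070xt     # RT performance per euro
value_5070ti = rt_perf_5070ti / price_5070ti

print(f"5070 Ti costs {price_ratio - 1:.0%} more")
print(f"RT perf per euro: 9070 XT {value_9070xt:.5f}, 5070 Ti {value_5070ti:.5f}")
```

On those assumed numbers the 5070 Ti ends up with slightly better RT performance per euro, which is the point: 15% more RT performance for 9% more money is not a bad trade if you actually use RT.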
I'm sure many people are buying a 5070 Ti for running PT in the handful of games where it's available. /s
I went with the 9070 XT instead of the 5070 Ti for the following reasons: a standard power cable, faster boot time with the AMD card, a much better control panel, and a lower price (roughly 150€ at the time of purchase).
And going by Nvidia's past decisions, I figured they would screw me over with some new feature when the 60 series launches, like they have screwed over every RTX owner since the 20 series. That one did not get ReBAR or FG despite being fully capable of both. I was a 2080 Ti owner myself, so that one stung me. The 30 series also did not get FG. The 40 series was denied MFG. God knows what the 50 series will be denied when the 60 series launches.
If Nvidia at least had a solid reason, then I could understand it, but they didn't. Modders proved it by getting ReBAR working on the 20 series, AMD proved it by giving 20 and 30 series owners FG, and Lossless Scaling proved it with MFG on older cards. Meanwhile I have not yet seen anyone getting FSR 4 running on prior-generation cards. There's also the issue of chip cutting. With Nvidia, a user can buy a 5090 for 2000+ and still not get the fully enabled chip, and Nvidia could screw them over within the same series by releasing a more fully enabled chip.
With AMD that is not the case: in every RDNA generation the top chip has been a fully enabled die.
I have owned an exactly equal number of AMD and Nvidia cards over the years, each time selecting the one that made sense for my needs and each time using it for 4-5 years. I'm not buying a 5070 Ti for PT, because by the time PT becomes widespread and easy enough to run, there will be cheaper and faster cards available to do it. For the same reason I did not buy my 2080 Ti for RT: I knew that by the time RT became widespread, the card would not be able to handle it anyway. I bought it for the VRAM instead of the newer 3070 at the time, and I was right. Besides, I came from a GTX 1080, so I already had 8GB of VRAM. It made zero sense to move to another 8GB card.
Ray tracing and G-Sync are at least partially hardware; Mantle is software. AMD was first with HBM, ReBAR, and PCIe 4.0, but none of those things made AMD's GPUs the best at gaming. PCIe 4.0 was released before it became necessary. Most people with Vega graphics cards don't know they have HBM. ReBAR is partly a CPU tech too, which I think gave AMD an advantage. Nvidia quickly adopted these things when it made sense.
But DLSS has a huge impact on gaming performance and is very visible to customers. Same with G-Sync and ray tracing to lesser extents.
I never said the features AMD came up with first made their GPUs the best at gaming. I simply pointed that list out to people who constantly claim that Nvidia was always first with everything and that AMD never introduced anything first.
I would argue that FreeSync has been the greatest boon for AMD. I noticed you left that one out. Yes, it is the same thing as VRR.
Technically FreeSync came after G-Sync, but it did democratize zero-cost VRR for the masses. Before that, you had to pay a $200 tax. I know because I had such a monitor with a hardware G-Sync module, and I used it for 9 years.
Especially with how many driver issues the 9070 XT has, just spend the extra $70 and have your peace of mind.
What "many driver issues"? The only driver issue I've had with the 9070 XT is that the latest 25.6.1 release requires HwInfo64 (and possibly GPU-Z) updates to display all sensor values properly. Looking at the TPU forums, I don't see a massive 12+ page thread about AMD driver issues either.
I suppose you could say I have a sample size of one, but I haven't seen anything major reported with RDNA 4 and drivers.
Meanwhile you suggest the 50 series for peace of mind. The same 50 series whose drivers are so bad that a major tech reviewer had to do a dedicated video on them. Even Digital Foundry mentioned it, and I've heard from laptop reviewers that most 50 series laptops are extremely buggy due to drivers, especially the 5060.
The G-Sync sticker made monitors $200 more expensive, so it was an easy skip on my part.
Well to be honest it did at least introduce something to the market that was missing before.
Arguably, sticking an expensive FPGA (one that in several cases required active cooling) into a monitor was not the greatest idea.
Not to mention the vendor lock-in. First-gen G-Sync modules did not have user-updateable firmware, and if a user paired an AMD card with a monitor that had such a module, they completely lost VRR.