
The not so ultimate 2025H1 guide on GPU purchase

So, the dust has settled and all the GPUs that are supposed to matter have been released. This guide is meant to help gamers more than anyone else. Professional users already know AMD doesn't do it right, or are at least supposed to be aware of that. Let's go.

DISCLAIMER:
The author of this post is an active AMD GPU user who knows what he is talking about when criticising AMD products.
If you feel like I'm pro-nVidia or anti-AMD, you're wrong about the former and right about the latter.

ADVICE No. 0: DON'T BUY A GPU UNLESS YOU ABSOLUTELY NEED TO. BOTH AMD AND NVIDIA DESERVE A MASSIVE BOYCOTT.

General thoughts:


I'm starting from the very bottom. RX 9060 series VS RTX 5060 series contest.
Both of these series are among the biggest bollocks of all time. Almost negative improvement compared to the previous-gen clown fiesta aka the RTX 4060 Ti. The 16-gig version of the 5060 Ti might've been a decent option, but at its burglary of a price it's a straight-up no. Ignore the 8-gig models, they won't last. The 9060 XT 16 GB is technically a reasonable value product, but taken more seriously it doesn't shift a gear. It loses to the 5060 Ti in game stability and upscaling support. The latter is devastating because at this performance level, with today's game system requirements, upscaling is unfortunately a must. And FSR4 is far from being everywhere.
A used 4070 could be a healthy alternative to either of them. If you can't trust your sellers, or if the prices are out of whack on this one... Well. Whichever xx60 GPU of this gen you pick, it will be an overpriced piece of rubbish. Don't come back saying you weren't warned.

Mid-range. RX nada VS RTX 5070 VS used 4070 Super/Ti.
If you care for 32-bit CUDA stuff, the answer is obvious for you. If you care for value, then pick whatever is cheaper. RDNA3 GPUs are irrelevant at this point. The 9070 non-XT might crawl into this price range; if it does, and if you're dead sure its half-baked state doesn't scare you, go for it. I wouldn't do that, because who knows how long it'll take AMD to fix game crashing (which is rare, to be fair), poor optimisation (more common, but still rare) and missing native FSR4 support (the vast majority of games).

Upper mid-range. RX 9070 series VS 5070 Ti VS 4070 Ti Super VS 4080 series.
Same again: if you care for 32-bit CUDA, then obviously it's either the 4070 Ti Super or the 4080, depending on how much you're ready to invest. If you don't... the 9070 XT offers about as much raw performance as the 4070 Ti Super, sometimes better, sometimes worse. But it offers no CUDA, native FSR4 is almost non-existent, and who knows when AMD will fix this. Want to risk it? Go ahead, but consider yourself warned. As for the 5070 Ti... not a lot is right with it. It's a seriously overpriced GPU with problems of its own, but if 32-bit CUDA support isn't a concern, it's the current best buy. I hate it, but what can I do.

Enthusiast range. Only nVidia cards populate this area. The 4090 if you care for 32-bit CUDA and/or having "only" 16 GB of VRAM is inconvenient for you. The 5080 otherwise. The 5090... if you can afford it, why are you reading this?

VRAM concerns:


8 GB of VRAM is a no-go in 2025, at least if you're paying more than two C-notes for a GPU. There's a lot of YouTube material showcasing how problematic this VRAM size is.
12 GB of VRAM is passable. A nothing burger: I wouldn't say it's enough for an exceptional gaming experience, but it's enough for reasonable gameplay. It will do if you don't do path tracing and aren't gonna crank everything up to the max. 4K monitor users should probably avoid GPUs this short on VRAM.
16 GB of VRAM is totally fine for now. There are maybe a couple of games where you can meaningfully use more, but those are complete outliers, and they mostly feel worse on AMD cards, which handle VRAM shortages a little worse than their nVidia counterparts. See Marvel's Spider-Man 2 at max settings, for example: the 16-gigabyte 9070 XT runs out of VRAM while the 16-gigabyte RTX 5070 Ti feels perfectly fine. There might be other such cases.
20+ GB of VRAM is overkill for actual gaming. If you get a GPU this rich in VRAM, you're most likely doomed to run out of calculating power before you run out of VRAM.
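If you want to see where your own card stands, here's a minimal Python sketch (nVidia-only, and it assumes nvidia-smi is on your PATH) that reports VRAM usage while a game is running. The 90% "getting tight" threshold is my own rule of thumb, nothing more:

import subprocess

# Query total and used VRAM through nvidia-smi's CSV interface.
def vram_status():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.total,memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    for line in out.strip().splitlines():  # one line per GPU
        total_mib, used_mib = (int(x) for x in line.split(","))
        pct = 100 * used_mib / total_mib
        # Sustained allocation above ~90% is where stutter and texture
        # swapping usually start creeping in (my assumption, not a spec).
        verdict = "getting tight" if pct > 90 else "fine"
        print(f"{used_mib} / {total_mib} MiB used ({pct:.0f}%) - {verdict}")

vram_status()

Run it mid-game; if you're pegged near the limit at the settings you actually use, that's your sign to shop a tier up on VRAM.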

RDNA2/3 and Ampere GPUs


These, frankly, are best avoided. RDNA3 cards might pack some punch in pure raster, but they don't support FSR4, they lack RT performance, they fall off in UE5 games, and things will only get worse for them. FSR3 upscaling is far from acceptable (at least to my eyes) and it will never evolve into something good. RDNA2 is just a huge nope, unless you paid like two hamburgers for the card. Ampere GPUs are much better value, but they have very limited VRAM (10 GB on the 3080, for example, doesn't feel right; and 8 GB on the 3060 Ti and the 3070 series is just THE insult). If you are completely sure of your actions, don't fear wattage spikes (these are horrible in both Ampere and RDNA2 cards), and don't mind the cards' age and mining status, then a 3080 series GPU might be a very good purchase. Check if your PSU can handle it before pulling the trigger anyway; see the rough headroom sketch below.
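To put that PSU check into numbers, here's a deliberately paranoid back-of-the-napkin sketch. The 2x transient multiplier and the 100 W rest-of-system budget are my own assumptions, not official figures:

# Rough headroom check for spiky Ampere/RDNA2 cards.
def psu_headroom_ok(psu_watts: int, gpu_tbp: int, cpu_watts: int,
                    spike_factor: float = 2.0, rest_of_system: int = 100) -> bool:
    # Worst plausible moment: GPU transient spike + CPU at load + everything else.
    worst_case = gpu_tbp * spike_factor + cpu_watts + rest_of_system
    return psu_watts >= worst_case

# RTX 3080 (320 W TBP) next to a 150 W CPU:
print(psu_headroom_ok(750, 320, 150))   # False - needs ~890 W by this math
print(psu_headroom_ok(1000, 320, 150))  # True

A quality 750 W unit often survives a 3080 in practice; the point of the pessimistic multiplier is exactly those transient spikes the paragraph above complains about.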

Older GPUs are generally only okay if bought for a crumb of lunch money and slotted into a system you don't really care about. But a 2080 Ti is a healthy alternative to the 5060 and 9060 cards, provided it's cheaper, the seller is reputable, and your PSU can handle it.


What about Intel?


Intel dGPUs are a good addition to the mix as a concept, but their actual performance is limited, and buying one is mostly a pig-in-a-poke investment. Absolutely avoid them if you don't have a CPU with outstanding single-thread performance (such as an i5-13600K or Ryzen 5 7600, or better).

What about PCI-e lane count?

x8 cards will be fine if your system supports PCI-e 4.0 or higher. On 3.0 systems, you should expect significantly worse performance in cases where VRAM is fully utilised; it's a major issue for the 8-gigabyte 5060 cards. Once again, if your system can do 4.0, then you don't have to worry about it. x16 GPUs are generally fine on anything that supports 3.0.
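Some napkin math on why, using the commonly cited effective per-lane throughput figures (after encoding overhead):

# Effective PCI-e throughput per lane, in GB/s.
PER_LANE_GBPS = {3.0: 0.985, 4.0: 1.969, 5.0: 3.938}

def link_bandwidth(gen: float, lanes: int) -> float:
    return PER_LANE_GBPS[gen] * lanes

print(f"3.0 x16: {link_bandwidth(3.0, 16):.1f} GB/s")  # ~15.8 GB/s
print(f"4.0 x8:  {link_bandwidth(4.0, 8):.1f} GB/s")   # ~15.8 GB/s - same pipe
print(f"3.0 x8:  {link_bandwidth(3.0, 8):.1f} GB/s")   # ~7.9 GB/s - halved

An x8 card on a 4.0 slot has the same pipe as an x16 card on 3.0; drop that same x8 card into a 3.0 slot and the pipe is halved, which hurts exactly when an 8-gig card starts spilling textures into system RAM.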

What about power efficiency?


The difference is minimal if we talk RDNA4 VS Ada VS Blackwell. Pick your poison.

What about 12VHPWR?


I hate it and I don't recommend running it at full throttle. Do yourself a favour and adhere to the 120 W per 8-pin wattage formula. Running a 5090 at 575 W isn't safe; 480 W should be fine. Make sure nothing bends and everything stays put.
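The formula as arithmetic, for anyone who wants it spelled out (the 120 W per plug is my rule of thumb above, not an official spec; an 8-pin is nominally rated for 150 W):

# "120 W per 8-pin": if your 12VHPWR cable adapts from N 8-pin plugs,
# cap the card near N x 120 W instead of trusting the 600 W connector rating.
def safe_power_cap(eight_pin_count: int, watts_per_8pin: int = 120) -> int:
    return eight_pin_count * watts_per_8pin

# A 5090 fed through a 4x 8-pin adapter: 575 W stock is over the line,
# while a 480 W power limit fits the formula.
print(safe_power_cap(4))  # 480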


What about frame generation and shiet?


No comment. I can't be objective on this matter.
 
Well, not buying any GPU isn't ultra realistic. If you are stuck with something low-range that is >5 years old (so basically a 5700 / 5700 XT / 2060 / 2070), then the 9060 XT is okayish. Anything newer than that and it's a very meh proposition.
 
A pretty good guide! Though I should note you might have mistyped 32-bit PhysX as 32-bit CUDA. 32-bit PhysX is no longer supported on Blackwell cards, but CUDA definitely is. CUDA is a must for some AI and a few productivity workflows, but it doesn't matter for gaming.

A decent Ryzen 5000 series (which almost any AM4 mobo owner can upgrade to) should be able to run the Intel Arc B580 without any major issues. The performance degradation would be very small. If anyone does have a Ryzen 3000, 2000 or even 1000 series CPU, then I'd say a CPU upgrade is definitely in order alongside any contemporary graphics card.
 
Though I should note you might have mistyped 32-bit PhysX as 32-bit CUDA
PhysX is a part of CUDA. They removed all 32-bit CUDA instructions, including PhysX.
If anyone does have a Ryzen 3000, 2000 or even 1000 series CPU, then I'd say a CPU upgrade is definitely in order alongside any contemporary graphics card.
While true, it's way worse on Intel GPUs than it is on green or red ones. 3700X, for example, is fine with a 4060 Ti 16 GB (my acquaintance has a PC like this), whereas with B580, the performance would've tanked obscenely. Anyway, that guy is saving up for a better PC, he wants to play at 4K...
 
PhysX is a part of CUDA. They removed all 32-bit CUDA instructions, including PhysX.

While true, it's way worse on Intel GPUs than it is on green or red ones. 3700X, for example, is fine with a 4060 Ti 16 GB (my acquaintance has a PC like this), whereas with B580, the performance would've tanked obscenely. Anyway, that guy is saving up for a better PC, he wants to play at 4K...

Ah, I see. You are correct about CUDA. It's also true that lower-end nVidia or AMD cards are a better match for old processors than the Intel release. However, I'd still advise anyone still using an old processor to consider something like a 5700X3D - should be compatible with the same motherboard and RAM with just a BIOS update and they only cost $150-200.
 
However, I'd still advise anyone still using an old processor to consider something like a 5700X3D
Depends on the tasks and on the market. In some countries X3D chips are too expensive, and some people need multi-threaded CPU performance more than gaming performance, so they'd buy a 5950X instead.
 
So, let me get this straight: this is a so-called "guide" that, instead of guiding you to a wise choice, tells you at the start that ALL GPUs from AMD and Nvidia should be boycotted, and Intel's GPUs aren't worth buying?

That pretty much makes reading all the rest pointless, especially when the disclaimer is in massive caps! :rolleyes:

I mean, I get it, the GPU market is a real crap show lately, especially with AMD bowing out of the high end of it. I just don't see that threads like this are helping matters on that front.
 
To make it short:
The only viable brand-new lower-mid-range GPU is the RX 9060 XT 16 GB (but only if you can get it at, or very close to, MSRP).
The only viable upper-mid-range card is the RTX 5070 (at least in my country), because it can be found for 600€, whereas even the cheapest 9070 model costs 80€ more at the moment.
The only viable high-end card is... surprise... the RTX 5090.

When it comes to the second-hand market:
A 2080 Ti is a good alternative to the 8 GB RTX 5060 / 9060.
A 3080 12 GB / 3080 Ti / 4070 are good alternatives to the 5060 Ti.
I would personally avoid all older AMD cards unless you can get them dirt cheap.
 
So, let me get this straight: this is a so-called "guide" that, instead of guiding you to a wise choice, tells you at the start that ALL GPUs from AMD and Nvidia should be boycotted, and Intel's GPUs aren't worth buying?
Wrong, in the sense that it's not the only thing it does. The longread that follows is an actual guide. I pretty much explained what to do if you actually need, or at least badly want, a new GPU, so I covered quite a lot of possible scenarios.
That pretty much makes reading all the rest pointless, especially when the disclaimer is in massive caps!
This disclaimer is built on my experience with the RTX 3060 and RX 6700 XT. The latter surely packs a bunch more calculating power for pure raster stuff, but the difference is nigh negligible in modern UE5 titles, and DLSS4 is so far ahead that you can easily compare DLSS Performance (transformer model) to FSR2/3 Native and see the latter losing in 19 scenarios outta 20 (at 1440p). Which means the 3060 is effectively FASTER than the more expensive 6700 XT, despite being a fairly underloved GPU built by so-called nGreedia on an inferior node. That's why I'm anti-AMD. They make false claims and unreliable products (not in the sense that they break, but in the sense that you can't expect them to age better than milk, even with some killer feature like significantly more VRAM than similarly priced nVidia cards), whereas even older nVidia GPUs get some love over time (the RTX 2000 series still gets feature updates despite being older than RDNA2/3, which are effectively going to be abandonware).

I would've loved to hold AMD in high regard. I would've loved for them to succeed. I would've loved for the market to be full of products actively trying to beat each other. But no, there's unfortunately none of that. They wronged us.
 