
GPU Test System Update for 2025

Seems like a waste of time IMO, he already has enough on his plate. I can't imagine testing that many GPUs, sounds like a nightmare honestly. It's a lot of work.
I can imagine him buying a used LTO tape library robot and making an automatic GPU swapper out of it. And he has automated almost everything else already. 36 cards is probably as many as the robot can handle.
 
Seems like a waste of time IMO, he already has enough on his plate. I can't imagine testing that many GPUs, sounds like a nightmare honestly. It's a lot of work.
Nah, it takes the same time as testing one 3060 Ti, just a different version, if available.
 
So we will see charts that are 90% green and the rest, just like Nvidia's market share.
2025, a great year to know which Nvidia card to buy.
Hope I'm wrong but RIP AMD and Intel.
It's been this way for 20+ years.
 
It's been this way for 20+ years.

Eh, I'd say 20 years is pushing it; 10 years, though, sure.

In the 7970 and 290X eras, AMD had some compelling high-ish-end stuff.
 
>No plans to add Call of Duty, because of the always-online nature, which enforces game patches that mess with performance—at any time. Also, the way Activision Blizzard is distributing it as a single title with DLCs for every game and multiple restarts just to launch the game is terrible.

Well said.
 
I can only hope the new test system won't bottleneck the RTX5090 too much :p
 
I can only hope the new test system won't bottleneck the RTX5090 too much :p

I mean, at 1080p, sure; at 1440p or above, I'm doubtful it'll be held back too much.

Kinda depends on how fast it is, though. It's basically using a slightly refined version of the node the 4090 was already on, and it's just bigger with way more memory bandwidth. Most gamers aren't going to be able to afford one anyway; after the initial cards dry up they'll be going for $2,500-3,000, because AI startups will be buying them in bulk. And the 5080 isn't gonna be better than what we already have...
 
I mean, at 1080p, sure; at 1440p or above, I'm doubtful it'll be held back too much.

Kinda depends on how fast it is, though. It's basically using a slightly refined version of the node the 4090 was already on, and it's just bigger with way more memory bandwidth. Most gamers aren't going to be able to afford one anyway; after the initial cards dry up they'll be going for $2,500-3,000, because AI startups will be buying them in bulk. And the 5080 isn't gonna be better than what we already have...

If the 5090 is 50-60% faster than the 4090, then the 5090 @4K = 4090 @1440p.

You forgot that Nvidia could very well flood the market with the 5090 at launch, since they stopped making the 4090 months ago, bringing prices closer to whatever MSRP they choose.

Btw, the 4090 had way more stock at launch than any previous launch; I didn't have any problem buying a 4090 on day one, whereas for previous GeForce launches I had to wait 2-3 months. With the 5090/5080 launch, Nvidia could bring it up a notch.
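For what it's worth, the pixel math makes that claim optimistic under naive scaling: 4K pushes 2.25× the pixels of 1440p, so a 50-60% uplift only closes part of the gap unless FPS scales sub-linearly with pixel count (as it usually does at high resolutions). A quick sketch of the arithmetic, taking the 50-60% figure as the thread's speculation rather than a benchmark:

```python
# Pixel counts for the two resolutions being compared
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
pixels_4k = 3840 * 2160      # 8,294,400 pixels

ratio = pixels_4k / pixels_1440p
print(f"4K renders {ratio:.2f}x the pixels of 1440p")  # 2.25x

# The 50-60% uplift is the thread's assumption, not a measured number.
for uplift in (1.5, 1.6):
    # Assuming FPS scales inversely with pixel count (real scaling is
    # usually sub-linear, which favors the faster card at 4K).
    relative = uplift / ratio
    print(f"{uplift - 1:.0%} faster at 4K -> {relative:.0%} of a 4090 @1440p")
```

Under strictly linear scaling a 50-60% uplift lands at roughly two thirds of the 4090's 1440p frame rate, so the "5090 @4K = 4090 @1440p" equivalence relies on the usual sub-linear resolution scaling.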
 
If the 5090 is 50-60% faster than the 4090, then the 5090 @4K = 4090 @1440p.

You forgot that Nvidia could very well flood the market with the 5090 at launch, since they stopped making the 4090 months ago, bringing prices closer to whatever MSRP they choose.

Btw, the 4090 had way more stock at launch than any previous launch; I didn't have any problem buying a 4090 on day one, whereas for previous GeForce launches I had to wait 2-3 months. With the 5090/5080 launch, Nvidia could bring it up a notch.

True, but if the MSRP is $2,000, most models will be $2,200-2,400, with the Strix at $2,500+, as things stand. So it shifting up a couple hundred after launch wouldn't be a surprise. I think with 50% more VRAM and 50% more performance, $2,000 is probably as low as we can expect. Even with decent supply out here in the States, the 4090 sold out immediately and was hard to get until late Feb/March after its launch. I had a bunch of trackers and still had a hard time getting one, although I passed on a couple hoping for an FE model...

@W1zzard Will SAM be active when testing AMD cards in that configuration?

It's the same thing; AMD just has their own fancy name for it.
 
They had stuff for sure, but never over 30% market share, I don't think.

For me they've pretty much been the default option since 2014, but before that it was at least a debate... As far as market share goes, that's probably true, though.
 
"I picked the RTX 4070 Ti as the 100% baseline for no specific reason."

Well, if you need a reason, you could say it's the ~$1,000 baseline.
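The choice of baseline doesn't change the rankings anyway; the relative-performance chart is just each card's average FPS normalized to the baseline card. A minimal sketch with made-up FPS numbers (purely illustrative, not TechPowerUp's data):

```python
# Hypothetical average-FPS numbers, for illustration only
avg_fps = {
    "RTX 4090": 160.0,
    "RX 7900 XTX": 130.0,
    "RTX 4080": 125.0,
    "RTX 4070 Ti": 100.0,  # chosen as the 100% baseline
}

baseline = avg_fps["RTX 4070 Ti"]
# Every card expressed as a percentage of the baseline card
relative = {card: fps / baseline * 100 for card, fps in avg_fps.items()}

for card, pct in sorted(relative.items(), key=lambda kv: -kv[1]):
    print(f"{card:12s} {pct:6.1f}%")
```

Dividing by a different card's FPS rescales every bar by the same factor, so the ordering and the ratios between cards are unchanged.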
 
I can only hope the new test system won't bottleneck the RTX5090 too much :p
The 9950X3D is coming anyway.

I wish the prices in the "performance per dollar" table reflected reality. I wish scalpers would die the instant they scalp the price.
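The complaint is fair: performance per dollar swings a lot depending on whether you divide by MSRP or by the scalped street price. A rough sketch with hypothetical numbers (the prices and FPS are assumptions, not the review's figures):

```python
def perf_per_dollar(avg_fps: float, price: float) -> float:
    """FPS per dollar spent; higher is better."""
    return avg_fps / price

fps = 150.0      # hypothetical average FPS for some card
msrp = 999.0     # hypothetical MSRP
street = 1400.0  # hypothetical scalped street price

print(f"at MSRP:   {perf_per_dollar(fps, msrp):.3f} FPS/$")
print(f"at street: {perf_per_dollar(fps, street):.3f} FPS/$")
# A ~40% price premium cuts FPS/$ by ~29% (1 - 999/1400),
# so the same card drops noticeably in a value chart.
```

This is why value charts computed at launch MSRP can look very different from what a buyer actually experiences while supply is short.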
 
The 9950X3D is coming anyway.

I wish the prices in the "performance per dollar" table reflected reality. I wish scalpers would die the instant they scalp the price.

I heard that scalpers got burned pretty hard with the PS5 Pro launch, so they'll probably think twice about scalping the latest GPUs.
 
I heard that scalpers got burned pretty hard with the PS5 Pro launch, so they'll probably think twice about scalping the latest GPUs.
Yeah, heard of and saw local scalpers getting burned as well. Hope NVIDIA does the same; they're already greedy enough without allowing more greed on top of their greed. (That didn't come out quite right.)
 
I suggest another page: Path Tracing Performance. Especially since performance with the upcoming GeForce RTX 50 series is going to increase further, these demanding settings will become more and more playable.
Hardware Unboxed tested ("Is Ray Tracing Good?", see the table at the 36:58 mark) how visually transformative ray tracing/path tracing is, and currently only a few games are:
"Transforms Visuals Significantly" (best):
  • Metro Exodus EE
  • Cyberpunk 2077 ("Overdrive")
  • Alan Wake 2 ("High")
"It's Generally Better Overall" (second best):
  • Cyberpunk 2077 ("Ultra")
  • Alan Wake 2 ("Low")
  • + 8 other games
Wonder where Indiana Jones and the Great Circle with the Full Ray Tracing setting would land on this table (and it's another game with path tracing).
 
Will you do BIOS updates on the rig? For AMD, they often change things.

Do you consider Linux benchmarks? At least in some instances?
 
"I picked the RTX 4070 Ti as the 100% baseline for no specific reason"

The 4070 Ti with only 12 GB of VRAM is a bad card and shouldn't be recommended for gaming/AI, because that amount of VRAM is DOA. The alternatives are the 4060 Ti 16 GB, which is a rip-off in the EU at €500, or the 4070 Ti Super 16 GB, which is also a rip-off at €1,000. I find myself running out of VRAM in modern games, and upcoming games will certainly use over 12 GB. If you look at the average 4K FPS, the 4080 Super vs. the 4080 is a 1 FPS difference; something's fishy here. Also, going from the RTX 3060 12 GB to the RTX 4060 Ti 16 GB, there's no major difference. This would mean either a) Ada Lovelace isn't great vs. Ampere, or b) the tests aren't suited to highlighting the difference between 12 and 16 GB of VRAM. But in real life they make a difference.
I would also like to say something about Intel not releasing >16 GB VRAM GPUs while the spot-market price per 8 GB of VRAM is under $3, and something about AMD delaying its new generation of cards.
But who buys new GPUs in January, anyway?
 
AMD's "Fine Wine" is turning to vinegar a bit in this round of testing.

Looks like AMD cards are seeing some slippage vs. Nvidia across the board. The 7900 XTX has fallen behind even the standard 4080, and it looks like AMD equivalents have lost a few percent vs. their NV counterparts up and down the stack.

Bummer.
That's really not the case, lol. These were all reference cards for sure; AMD AIB cards are running faster than this. The average FPS of the reference 7900 XTX is 142.9 and the 4080 S is at 145.8, a whole 3 FPS at 1440p; "some slippage" is a stretch. Especially when you look at their OC capabilities, the 4080 S trails behind with the AIB cards: ASUS GeForce RTX 4080 Super STRIX OC Review - Overclocking & Power Limits | TechPowerUp; ASRock Radeon RX 7900 XTX Taichi White Review - O_o Sexy - Overclocking | TechPowerUp
 
Seeing as most / all games these days allow some form of upscaling and most gamers use some form of upscaling, I don’t think it unreasonable to include some upscaling benchmark numbers (and would love to see them myself), but realize it’d be quite a bit more work.
 
Seeing as most / all games these days allow some form of upscaling and most gamers use some form of upscaling, I don’t think it unreasonable to include some upscaling benchmark numbers (and would love to see them myself), but realize it’d be quite a bit more work.

Maybe just take the heaviest RT game and bench it with a 4090/7900 XTX/B580, with each card's preferred upscaler and FG enabled, to see what happens.
 
Especially when you look at their OC capabilities, the 4080 S trails behind with the AIB cards.
OC doesn't count. Most people just want to put the card in the case and be done; they don't care about overclocking.
 