
RTX 5060 8GB performance

Yeah, $349 will be for the first 10 people (who will then spend all their time praising it online); $400 or more for the rest :roll:.

The 5060 still has no competition at $300 :wtf:

The 8GB AMD card will have the same core as the $350 one that's supposed to be faster than the 5060 Ti... They will likely be more desirable, so I doubt they will hit their MSRPs though.

8GB is ok right.... Right lmao.
 
The 8GB AMD card will have the same core as the $350 one that's supposed to be faster than the 5060 Ti... They will likely be more desirable, so I doubt they will hit their MSRPs though.

8GB is ok right.... Right lmao.

Thing is, the 8GB 5060 Ti fell flat at 1440p (which is what AMD compares the 16GB 9060 XT to)


So at MSRP (hah), the 16GB 9060 XT will have the same perf/dollar as the 5060, but the 16GB of VRAM will get it love from those reviewers (who will immediately forget that DLSS 4 has way better game support)
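For what it's worth, here's the arithmetic behind that perf/dollar claim (a quick sketch using the ~$300 and $349 price points mentioned in this thread; actual street prices will differ):

```python
# Rough perf/dollar check at the rumoured MSRPs (prices are assumptions taken from this thread).
msrp_5060 = 299           # USD, RTX 5060 8GB
msrp_9060xt_16gb = 349    # USD, 16GB Radeon card

# To match the 5060's perf/dollar, the pricier card needs this much more performance.
required_uplift = msrp_9060xt_16gb / msrp_5060 - 1
print(f"Needs ~{required_uplift:.0%} more performance")   # ~17%
```

So "same perf/dollar" only requires the 16GB card to be roughly 15-17% faster at those prices.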
 
Thing is, the 8GB 5060 Ti fell flat at 1440p (which is what AMD compares the 16GB 9060 XT to)

So at MSRP (hah), the 16GB 9060 XT will have the same perf/dollar as the 5060, but the 16GB of VRAM will get it love from those reviewers (who will immediately forget that DLSS 4 has way better game support)


Sure, but HUB's data isn't accurate according to all the Nvidia fanboys...

W1z has them pretty close together.

[attached chart: average FPS at 2560×1440]

Obviously we need to wait for reviews, but it's $80 cheaper than Nvidia's cheapest card with more than 8GB... Here in the States, even at $400 it will be quite a bit cheaper than the 16GB 5060 Ti lol.
 
Sure, but HUB's data isn't accurate according to all the Nvidia fanboys...

W1z has them pretty close together.


Obviously we need to wait for reviews, but it's $80 cheaper than Nvidia's cheapest card with more than 8GB...

HUB uses lots of games that weigh heavily on VRAM (all the Sony ports); I'm quite sure AMD is doing the exact same (or AMD has been gently nudging them to test those games :roll:).

Looking at HUB's FSR4 review, it's all Sony ports :wtf: aside from Hunts
 
HUB uses lots of games that weigh heavily on VRAM (all the Sony ports); I'm quite sure AMD is doing the exact same (or AMD has been gently nudging them to test those games :roll:).

Probably, and I don't think they will be good cards; it's just that the 60 series isn't either. They only have to be OK if the 16GB model hits $349, because then it comes down to DLSS vs 16GB. FSR4 is quite good though, so to me it really just comes down to whether this is a bit faster and actually hits MSRP. I haven't been the biggest fan of the DLSS SR transformer model because it struggles with disocclusion in third-person games. Not a big deal; I just use DLSS Swapper in those games and switch to the older model.

I will give AMD a couple of bonus points for not saying it's faster than the 7900 XTX with the power of AI lmao...

HUB did bash the FSR support and the 8GB model in their video on the 9060 series announcement, so at least they are keeping it even.
 
Just wondering why they even release crap like this. We already got a 12GB xx60 card two generations ago; that should be the bare minimum for cards at that segment/price point. Leave 8GB of VRAM to the entry-level ones with a 64-bit bus, and if those aren't released to the desktop market, then to laptop GPUs.

Having a 192-bit bus on the xx60 cards shouldn't cannibalize the sales of higher-tier cards; it didn't with the 1060/2060/3060.
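To put some numbers on the bus-width point, here's a minimal sketch of how bus width maps to the usual VRAM capacities, assuming standard 32-bit, 2GB GDDR modules (3GB modules or clamshell mounting change the result):

```python
# Typical VRAM capacity from memory bus width, assuming 32-bit channels and 2 GB (16 Gb) chips.
def vram_gb(bus_width_bits: int, module_gb: int = 2) -> tuple[int, int]:
    """Return (normal, clamshell) capacities for a given bus width."""
    chips = bus_width_bits // 32                        # one GDDR chip per 32-bit channel
    return chips * module_gb, chips * module_gb * 2     # clamshell doubles chips per channel

for bus in (64, 128, 192, 256):
    normal, clamshell = vram_gb(bus)
    print(f"{bus:>3}-bit -> {normal} GB (or {clamshell} GB clamshell)")
# 128-bit gives today's 8 GB / 16 GB split; 192-bit gives 12 GB like the 3060.
```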
 
Probably, and I don't think they will be good cards; it's just that the 60 series isn't either. They only have to be OK if the 16GB model hits $349, because then it comes down to DLSS vs 16GB. FSR4 is quite good though, so to me it really just comes down to whether this is a bit faster and actually hits MSRP. I haven't been the biggest fan of the DLSS SR transformer model because it struggles with disocclusion in third-person games. Not a big deal; I just use DLSS Swapper in those games and switch to the older model.

I will give AMD a couple of bonus points for not saying it's faster than the 7900 XTX with the power of AI lmao...

HUB did bash the FSR support and the 8GB model in their video on the 9060 series announcement, so at least they are keeping it even.

At $400 the 9060 XT will never be the 5060 killer that HUB wishes it was, though; it's just out of the price range for budget builders.
 
At $400 the 9060 XT will never be the 5060 killer that HUB wishes it was, though; it's just out of the price range for budget builders.
If the rumoured $350 for the 16 GB version holds true, then it won't be so bad, imo. $400 would put it too close to the 9070 and 5070.
 
Did I miss something?


Yeah, it's called marketing. They are comparing with previous generations when they only launched flagships. Of course the 9070 XT and 9070 are going to sell much more than previous launches, since previous launches were flagship cards that cost up to twice as much. Add in the rebate that kept the prices low for a week, and you get that catchy "10x more than previous gens".
 
Yeah, it's called marketing. They are comparing with previous generations when they only launched flagships. Of course the 9070 XT and 9070 are going to sell much more than previous launches, since previous launches were flagship cards that cost up to twice as much. Add in the rebate that kept the prices low for a week, and you get that catchy "10x more than previous gens".
Check out the second link.
 
Check out the second link.
It's Mindfactory, man. AMD has been outselling - or at least matching - Nvidia in GPU sales on Mindfactory for more than half a decade. It's an extreme outlier, evident by the fact that the overall market share at the end of the day isn't anywhere close to the 50-50 that the Mindfactory sales would suggest.
 
It's Mindfactory, man. AMD has been outselling - or at least matching - Nvidia in GPU sales on Mindfactory for more than half a decade. It's an extreme outlier, evident by the fact that the overall market share at the end of the day isn't anywhere close to the 50-50 that the Mindfactory sales would suggest.
So what data do you look at when you're quoting sales numbers? Neither AMD nor Nvidia make them public, so they're all estimates as far as I know.
 
So what data do you look at when you're quoting sales numbers? Neither AMD nor Nvidia make them public, so they're all estimates as far as I know.
TBH I don't care that much about sales numbers; I'm not even sure myself what the truth is, but you should ask @Hecate91, he calls Nvidia a multi-trillion-dollar monopoly with 90% market share every other post. He has his sources, I suppose.

But Mindfactory is for sure not "reliable", in the sense that it doesn't represent the entire planet, or even all of Germany.
 
TBH I don't care that much about sales numbers; I'm not even sure myself what the truth is, but you should ask @Hecate91, he calls Nvidia a multi-trillion-dollar monopoly with 90% market share every other post. He has his sources, I suppose.

But Mindfactory is for sure not "reliable", in the sense that it doesn't represent the entire planet, or even all of Germany.
All I'm saying is, without any official data, sensationalist news from Mindfactory and such is the only thing to go by.

Edit: Also, I've been watching the Microcenter stock alert thread for a while. They also had about 9x as many 9070 cards as 5070 on their respective launch days.
 
All I'm saying is, without any official data, sensationalist news from Mindfactory and such is the only thing to go by.
Then we need to stop saying that Nvidia has 90% market share, though; just go check TechEpiphany's posts for the past 5 years. AMD has been dominating GPU sales on Mindfactory since forever.
 
Then we need to stop saying that Nvidia has 90% market share, though; just go check TechEpiphany's posts for the past 5 years. AMD has been dominating GPU sales on Mindfactory since forever.
I think first we should define market share. That is, define in which time period we're talking about. The last quarter? Or the last year? Or total number of users? Also, what users? Desktop dGPU? Or desktop and mobile? What about console chips?
 
It's an extreme outlier

Wow, ok. Didn't think it would be an extreme outlier.

Germany is a first world country so going by that demographic, I'll go by their sales numbers until I can find some more concrete evidence.

Edit: In any case, I wasn't so much highlighting their GPU sales or whatnot; it was more about quoting what @LastDudeALive said about "AMD's a $200 billion company that's close to abandoning the discrete GPU market because they haven't given people what they want"

I believe they did give people what they wanted. The cards flew off the shelves, caused a shortage, and prices went up. At the initial MSRP it was a pretty decent product vs what Nvidia offered at the same competing MSRP.
 
It would take some extremely out-of-touch reviewers to say this is a bad deal because of 8GB VRAM and stuff :kookoo:

Yeah, it is still a bad deal (at least for me personally)...
I'm still running an 8GB RTX 2070 Super (aka a very slightly cut-down 2080), which I bought for €220 ($250), and that was two and a half years (basically 3 years) ago!
Then the 4060 8GB came out and I laughed, because it was barely any faster (not really) than my 2-generations-old cut-down 2080...

Now this mess hits the shelves and it is what? Like 30% faster in rasterization (if that),
and it has exactly the same memory bandwidth and VRAM as my 3-generations-old, 7-year-old 70-class (Ti/Super) graphics card...

It's insane how much the mid-range GPUs from Nvidia and AMD have fallen over the last 6-7 years...
If for some reason I had to get up and purchase a new graphics card tomorrow,
I would just get an RTX 3080 for $350 (maybe $320 after some negotiations with the seller) on the second-hand market...

Also, the power consumption is not really that big of a deal and would not affect my bill in any meaningful way -
in my region a kWh costs only €0.07 ($0.08)
(also, I almost forgot, we have a "night-time tariff" discount of €0.05 from 10pm until 6am -
many times I'm gaming from 10pm to 1am, so it is also very relevant for me).
After some calculations, it would cost me about $50 annually
to run a graphics card with an average gaming consumption of 350W if I gamed 365 days x 5 hours each day on it
(at the full $0.08 price all the time, which there is no chance of me doing).
So yeah, there is that - just in case you decide to throw in power consumption as a factor...
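For anyone who wants to check that figure, the math in that post works out (a quick sketch with the numbers quoted above; plug in your own wattage and tariff):

```python
# Annual electricity cost of GPU gaming, using the figures from the post above.
watts = 350            # assumed average gaming power draw
hours_per_day = 5
days_per_year = 365
price_per_kwh = 0.08   # USD, the full daytime tariff (the night tariff would be cheaper)

kwh_per_year = watts / 1000 * hours_per_day * days_per_year   # ~639 kWh
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")   # ~$51, i.e. roughly $50
```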
 
MSRP (hah)
XD!

it is still a bad deal
THIS^

It's insane how much the mid-range GPUs from Nvidia and AMD have fallen over the last 6-7 years...
Read S***hole range ;)

Page 7 we're on :rockout:

First of all, I know the RTX 5060 is not primarily suited for 4K ultra gaming. I really get it.
How come this GPU is worse than the RTX 4060 in 4 out of the 8 games tested at 4K? And not by a small margin.
The RTX 5060 has 448 GB/s of memory bandwidth, the RTX 4060 has 272 GB/s. Those minimum fps numbers don't make sense.
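Those bandwidth figures line up with the memory specs if you work them backwards (a quick sketch; the per-pin speeds are the commonly quoted 28 Gbps GDDR7 and 17 Gbps GDDR6, both on a 128-bit bus):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(f"RTX 5060: {bandwidth_gb_s(28, 128):.0f} GB/s")   # 448 GB/s (GDDR7)
print(f"RTX 4060: {bandwidth_gb_s(17, 128):.0f} GB/s")   # 272 GB/s (GDDR6)
```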



Source: https://www.pcgamer.com/hardware/live/news/nvidia-rtx-5060-review-doing-it-live/
That's just one source.

Gather data from more than one.
 
Wow, ok. Didn't think it would be an extreme outlier.

Germany is a first world country so going by that demographic, I'll go by their sales numbers until I can find some more concrete evidence.

Edit: In any case, I wasn't so much highlighting their GPU sales or whatnot; it was more about quoting what @LastDudeALive said about "AMD's a $200 billion company that's close to abandoning the discrete GPU market because they haven't given people what they want"

I believe they did give people what they wanted. The cards flew off the shelves, caused a shortage, and prices went up. At the initial MSRP it was a pretty decent product vs what Nvidia offered at the same competing MSRP.
It's not Germany's market though, it's specifically Mindfactory. They have a history of working with AMD (offers, rebates, bundles, lower prices than other shops), so it's safe to assume that everyone in Germany who wants to buy AMD buys from Mindfactory - which kinda skews the results.
 
We already got a 12GB xx60 card two generations ago; that should be the bare minimum for cards at that segment/price point. Leave 8GB of VRAM to the entry-level ones with a 64-bit bus.
THIS^ should be pinned!

Having a 192-bit bus on the xx60 cards shouldn't cannibalize the sales of higher-tier cards; it didn't with the 1060/2060/3060.
They know that full well; it's just that they are NOT willing to provide such solutions at the moment!

A 12GB 5060 SUPER (128-bit bus with 3GB memory modules) is coming our way at CES 2026.

They also had about 9x as many 9070 cards as 5070 on their respective launch days.
Interesting info there!

I haven't been the biggest fan of the DLSS SR transformer model because it struggles with disocclusion in third-person games. Not a big deal; I just use DLSS Swapper in those games and switch to the older model.
Thanks for sharing!
 
When it comes to longevity, people just focus on the amount of VRAM but not a whole lot on how fast it is. In games that like faster VRAM, the 5060 is a beast, matching or beating much more expensive cards. In God of War, A Plague Tale and Hogwarts Legacy, for example, it's 45% (!!!) faster than the 4060 (HUB's review) and matches or beats the much more expensive 7700 XT. I'd be wary of buying cards with outdated, slow GDDR6 VRAM; it seems it will be a limiting factor in the next few years and performance will drop like a potato. I mean, it's not horrible tbf, it's still fine for esports etc., but I wouldn't spend more than $250 on a GPU with GDDR6 for AAA gaming in 2025. Cards are expensive as heck nowadays, so since you are paying through the nose regardless, buy something that lasts; don't settle for slow VRAM.


 
When it comes to longevity, people just focus on the amount of VRAM but not a whole lot on how fast it is. In games that like faster VRAM, the 5060 is a beast, matching or beating much more expensive cards. In God of War, A Plague Tale and Hogwarts Legacy, for example, it's 45% (!!!) faster than the 4060 (HUB's review) and matches or beats the much more expensive 7700 XT. I'd be wary of buying cards with outdated, slow GDDR6 VRAM; it seems it will be a limiting factor in the next few years and performance will drop like a potato. I mean, it's not horrible tbf, it's still fine for esports etc., but I wouldn't spend more than $250 on a GPU with GDDR6 for AAA gaming in 2025. Cards are expensive as heck nowadays, so since you are paying through the nose regardless, buy something that lasts; don't settle for slow VRAM.


"Outdated, slow" GDDR6 is still infinitely faster than swapping data through the PCI-e bus (I don't understand where you got the idea that it's outdated and slow).

Also, frames per second isn't the only metric where more VRAM shows. Quite a few modern games have texture loading issues, pop-ins, slowdowns after long sessions, etc. when they exceed your VRAM, but act fine during a benchmark run or when looking only at your FPS.
 
"Outdated, slow" GDDR6 is still infinitely faster than swapping data through the PCI-e bus (I don't understand where you got the idea that it's outdated and slow).

Also, frames per second isn't the only metric where more VRAM shows. Quite a few modern games have texture loading issues, pop-ins, slowdowns after long sessions, etc. when they exceed your VRAM, but act fine during a benchmark run or when looking only at your FPS.
Well, it is slow; that's not really debatable. New GDDR7 is faster.

I'm comparing cards with an equal amount of VRAM (4060 vs 5060), where their only real difference is the VRAM speed. In games that take advantage of that, the 5060 is 45% faster. So I wouldn't buy GDDR6 in 2025 unless it was a bargain, when GDDR7 can be 45% faster. I mean, think about it, 45% is completely crazy.
 
Well, it is slow; that's not really debatable. New GDDR7 is faster.
The Bugatti Chiron is faster than a Ferrari. That doesn't mean that a Ferrari is slow.

I'm comparing cards with an equal amount of VRAM (4060 vs 5060), where their only real difference is the VRAM speed. In games that take advantage of that, the 5060 is 45% faster. So I wouldn't buy GDDR6 in 2025 unless it was a bargain, when GDDR7 can be 45% faster. I mean, think about it, 45% is completely crazy.
That could be the difference in core config, too (3840 vs 3072 cores). If it was only the more modern VRAM, then we'd see the same increase all across the board.

Edit: I find it more likely that we don't see an increase where VRAM is the issue, and we do where it isn't, and the higher number of cores get their chance to shine.

Edit 2: Let's not forget that the 4060 - 5060 is the biggest core number upgrade we've got this gen from Nvidia, if you don't count the 5090.
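For reference, the raw spec gaps behind that back-and-forth (my own arithmetic, ignoring clock-speed differences):

```python
# How big are the 5060's hardware advantages over the 4060 on paper?
cores_5060, cores_4060 = 3840, 3072   # CUDA cores
bw_5060, bw_4060 = 448, 272           # memory bandwidth, GB/s

print(f"Core-count uplift: {cores_5060 / cores_4060 - 1:.0%}")   # ~25%
print(f"Bandwidth uplift:  {bw_5060 / bw_4060 - 1:.0%}")         # ~65%
# Whether a given game's gap stays near the ~25% shader uplift or pushes toward 45%
# is exactly what the core-vs-memory argument above hinges on.
```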
 