NVIDIA GeForce RTX 5060 8 GB

HWUB has shown that even at 1080p, 8GB is not enough. Even in scenarios where performance doesn't take a dive, visual quality may take a hit. That's something not addressed in this review.
IMO it is generally addressed in W1zzard's wall-of-text conclusion. (I say this jokingly. I actually really like these detailed conclusion notes.) From the conclusion:
If you play with the settings carefully, mix and match upscalers and frame generation, you'll still be able to achieve a decent gaming experience in many games[.]
If you're not willing to work the settings, do not buy an 8 GB card in 2025.
Ideally, there would be a 12 GB VRAM model of the RTX 5060[.]

Since Hardware Unboxed started showing in videos that GPUs with less VRAM than needed don't load textures properly, leaving them blurry and low quality, it is a MUST that reviewers check whether that happens and show it in the charts next to the FPS numbers, so buyers know whether the GPU is good enough to play a specific game properly.
Games HWUB has documented as dynamically dropping textures with 8GB: Halo Infinite, Forspoken, Hogwarts Legacy, and Ratchet & Clank. (Am I missing any?) All other games will stutter or crash unless you lower the texture quality setting, and we can see that in graphs of FPS and 1% lows.
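Side note on that last metric, since it comes up in every one of these charts: "1% lows" summarize the slowest frames, which is why VRAM-induced stutter shows up there even when the average FPS looks fine. A minimal sketch with made-up frame times, using one common definition of 1% lows (the FPS equivalent of the slowest 1% of frames):

```python
# Hypothetical frame times in milliseconds; real data would come from a
# frame-time capture tool, not from this script.

def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    """FPS equivalent of the slowest 1% of frames."""
    count = max(1, len(frame_times_ms) // 100)
    slowest = sorted(frame_times_ms, reverse=True)[:count]
    return 1000 / (sum(slowest) / len(slowest))

smooth  = [16.7] * 990 + [17.0] * 10    # ~60 FPS, no stutter
stutter = [16.7] * 990 + [100.0] * 10   # similar average, occasional big spikes

print(round(avg_fps(smooth)),  round(one_percent_low_fps(smooth)))   # 60 59
print(round(avg_fps(stutter)), round(one_percent_low_fps(stutter)))  # 57 10
```

The averages barely move, but the 1% low collapses, which is exactly the signature you see in the graphs when VRAM runs out.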

I suspect the reason W1zzard doesn't feel it's necessary to prominently mention this is because he simply hasn't been bothered by the dynamic texture issues when playing, if he's even observed it. It's subjectively less distracting for some people. HWUB folks are particularly sensitive to low quality textures so they focus on it (or if you're feeling conspiratorial, you'd say they hype it for more YT views). HWUB also suggests a base frame rate of 80+ before enabling frame gen. W1z on the other hand seems okay with frame gen on top of 45-50 FPS for slower 3rd-person games.
 
I do believe that the RTX 5060 is one of the best products in the RTX 5000 lineup. It has a decent generational uplift, maintains its pricing, and makes nearby cards seem redundant. Another good card is the RTX 5070 Ti in my opinion, and in third place the RTX 5090. The issue is that each card has some flaw that makes it underwhelming. The RTX 5070 and RTX 5080 lack performance. The RTX 5060 Ti is overpriced for its specifications, and the RTX 5060 costs quite a bit less while being nearly as good. Sure, it could be better, but it is what it is.

The RX 9060 XT has the potential to be a very strong competitor to the RTX 5060 and RTX 5060 Ti. However, I fully expect that AMD will continue being AMD during the release of that card and will fuck it up somehow. This time, it is fake MSRP. Maybe they are back to their old tricks of lying about their own benchmarks too? Who knows, they are quirky like that and this is why people love them.

As for what Nvidia should or shouldn't give, it is not up to the consumer to decide. Gamers always love to treat companies and corporations as their friends and family, expecting a free ride from them. This attitude is pointless and only worked because we were riding an amazing wave of silicon manufacturing advances. In a few decades we made massive investments and experienced unprecedented success. However, we have exhausted the easy gains and the technology has matured. Gamers will have to get used to the RTX stuff: one great generation and then one bad one. If the industry is in bad shape, that bad generation is stretched to four years and re-released multiple times until all the cards become what they should have been from the start. Blackwell, for example, fixed the low end while Lovelace delivered the high end. We are four years into basically the same family of GPUs.
 
Kudos to Nvidia for making what AMD CAN'T.
The last time AMD made such a small but not dog-crap-slow GPU was the Radeon R9 Nano.


Ackshually there's the PowerColor RX 5700 ITX and the Vega 56 Nano.

Not sure I follow you. The card reviewed here is a standard slot width, not low profile. Are you referring to a possible LP version of this card? As far as I know, if money is no object, the RTX A4000 SFF is king of the hill for low profile, then the RTX 4060 LP, then the RTX A2000. Pretty sure Sparkle is focused on Intel's Arc Pro B50/B60, which are scheduled to be released in mere weeks (Q3) and which I am quite excited for.

There is apparently a Sparkle "Genie" LP B570, so you can get the performance without paying 2x the price for VRAM and workstation branding, depending on your preferences... it's >2 slots though and long-ish for LP.

Still looking for an R9 Nano-style single-fan ITX card with more than 8GB of VRAM. There were single-fan 4060 Ti 8GB cards, but then there weren't any for the 16GB version... do OEMs/AIBs need an extra fan to cool an extra 8GB of VRAM chips? :rolleyes:
 
Why is there no RX 7600 XT (16GB!) in the table, for example, which supposedly draws 4W at idle (Sapphire), while the 5060 draws as much as 9W? But there is room for stupidities like PCIe 5.0.
The review was written for Nvidia and it shows.
Calm down. It is in the table. 14 watts. You should ask W1zzard (nicely!) why it's 2x/3x higher than when it was first tested last year. And I would say that he's done Nvidia no favors with this review by withholding all recommendation badges.

Well, these idle power consumption numbers are unrealistic and a result of systematic measurement error.
There is no way this little card draws 9W in idle, more likely 0.9W.
So you've tested this on your own bench? Or at least can link to someone else's measurements? Let's keep the level of discourse here high, and not muddy things with baseless accusations.



There's so little respect these days for expertise. Anti-intellectualism is celebrated at all levels of society. Eager and uncritical belief in conspiracies is on the rise. What can I do to improve the situation, other than keep replying every time someone is wrong on the Internet?
 
Just a little tip, Nvidia is 90% there to make money.

Y'all complain about prices, yet prices were basically locked for over 20 years. They just got with the times, much like game makers now charging well over the standard 50-60 bucks.

We are just a drop in an enormous pond. Nvidia has much better things to make for cars, AI, etc. Knowing what they get for those products, with much higher demand than GPUs, I feel blessed they don't just walk away from us.
What is the other 10%? Nvidia is 99.999999999999999% there to make money; it's them not even trying anymore that is part of the problem. And before someone says "they're not trying but still winning": well yeah, when one company has 1000000x the money and nearly all of the market, they have more to spend on R&D.
But Nvidia as a company is capable of focusing on cars, AI, and the gaming market at the same time; they should either serve the gaming market properly or stop inflicting stagnation on the majority of the low-end and mid-range market and go fully to AI and cars.
I feel blessed I can walk away and buy something from Intel or AMD.
 
HWUB has shown that even at 1080p, 8GB is not enough.
And that is absolutely false rubbish. I've said it before and I'll repeat it here: HUB is full of moose muffins. Rubbish reporting at best.

I disagree. 1080p is the lowest, cheapest.
No. The lowest and cheapest is still 720p, and those screens can be had everywhere.
Sweet spot means something in balance, in the middle.
And that's what it is. 720p/768p screens are still a thing that can be had brand new. So...
There is nothing lower than 1080p.
...no. That is patently false.
 
News is just in: Blackwell sells really well. Straight from Nvidia's earnings report.
  • First-quarter Gaming revenue was a record $3.8 billion, up 48% from the previous quarter and up 42% from a year ago.


Though this is a little bit off topic, talking about how Nvidia should give us more is also off topic. The truth of the matter is: they are sellers, we are buyers. If the product is good enough, we buy it; if not, then we do not. The market has clearly spoken that even Nvidia's third-best, lackluster effort is still hands down the best thing on the market. Maybe AMD will see similar GPU sales growth in the future, but considering the hole they are currently in, they need a lot more than a mere 50 percent growth.
 
Nodes are constantly getting smaller, and yet historically the die sizes of each tier of GPU have generally gone up or down within a range that was normalized for each tier.

It's only in recent Nvidia GPU generations that die size has gone drastically under the typical die size for each tier, excluding of course the 5090. The 5090 also breaks your logic here, as its die size went up. Mind you, it certainly isn't the only GPU to see its die size increase.

Surely you jest given the 5080 and 5090's power consumption.

Architecturally speaking, Blackwell provides a margin-of-error improvement in power consumption (for all intents and purposes, zero), so we know for a fact that Nvidia did not prioritize power consumption.

The low power consumption of Nvidia's mid- and low-end cards is the result of the small die sizes, because you are getting physically less. It's akin to calling a mini KitKat healthy because it has 1/8th the calories of the regular-sized candy.
Did you miss when I said that, in this segment, Nvidia is prioritizing cost control over performance gains? The 5080 and 5090 are in different segments than the 5060.

The simple fact is, the 5060 isn't for you, or anyone on this forum for that matter. This GPU is for prebuilts and laptops, bought by ordinary consumers, and 90% of the playtime will be spent on esports and casual games. In that segment, keeping the cost at $300 gen-on-gen (significantly down from the 3060's price), and a decent performance gain is exactly what Nvidia figures this market segment (again, NOT you or anyone else here) wants.

News is just in: Blackwell sells really well.
Can't wait for the Steam Hardware survey next month, as long as it's not China-skewed
 
And don't forget old games. Go to Steam's top most-played games; I'll do it for you. We need to scroll down to #35 (Cyberpunk 2077) and #49 (Monster Hunter Wilds) before the RTX 5060 faces any challenge running games well, at 1080p or 1440p. Not to mention that a lot of people play old games or some indie or AA niche title like Project Zomboid. Casual gamers are not likely to encounter difficulties in the games they play. Not to mention, nobody I know needs a more powerful GPU. My friends in university? FIFA and League of Legends addicts. My friends now? Small indie games or some trending games like Baldur's Gate 3, Palworld, etc. The RX 7600 and RTX 3060 12 GB have served them great all this time. I play a lot of old games or undemanding esports games, and I get through a handful of flagship games that really push my GPU to the limit before the next upgrade cycle.

We could argue that it could be better, and I don't disagree with you there. It could, but it isn't, and nobody else is capable of producing a better card. B580? Demolished by RTX 5060 performance and pretty much irrelevant. Sure, it has that extra VRAM, but people are not going crazy for it. AMD could release the RX 9060 XT, but I have my doubts; if it turns out to be a 400-dollar card in actuality, that would make it irrelevant.

I don't like this entitlement from people. I remember when I bought my first card, a GTX 760. I was happy to play Crysis 3 at 40 FPS in 1080p, or heck, it might have been 720p-something. The 60-series card was always a compromise, but people out of nowhere feel like it is a 1440p card now, and only the 50-series card is a 1080p card. I'm hearing a lot of wild statements about how things should be. It is ultimately not healthy and will lead to disappointment, especially since, due to the slowdown in nodes, we physically cannot deliver the same uplift generation after generation anymore. It will be more about software and features, and honestly, those are great too.
 
I disagree entirely. 1080p is a sweet-spot resolution that is affordable. The video card companies have nothing to do with it. I'm not going into that further.
So here is a brief history.

World's first electronically scanned television service started in Berlin in 1935.

30 years later we got an updated standard with the Color Transition in 1965.

The first public HD broadcast in the USA was in 1996. The ATSC launched in 1998. Congress in 2006 voted to force the HD upgrade by 2009.

We are supposedly in the golden age of technology, yet we have been able to use 1080p since the early-to-mid 1990s. It has been nearly 30 years since the first public HD broadcast.

I was able to play some games on my GTX 970 in 2014 at 1440p. I mean, even in 2008 we had monitors like the 30" Gateway XHD3000 with a native resolution of 2,560 x 1,600.

Yet we can't even move the bar in PC gaming after all this time to 1440p or what some call 2k? Let alone 4k? Please.

News is just in: Blackwell sells really well. Straight from Nvidia's earnings report.

Blackwell this generation includes everything; in the previous generation, for instance, they were broken up, with Ada being consumer and Hopper for data centers and the like.

So, RT has become Nvidia’s own worst enemy? Giggles echo in the background. Jokes aside, it would be a decent product if it weren’t for the VRAM limitation. I’m expecting a 5060S 9GB to drop soon

Yet you can play Indiana Jones on Vega under Linux, with ray tracing done in software.
 
The dilemma of choosing between performance and price...

No doubt, a standard 60-class card that can't outperform last gen's 4060 Ti and even lags behind the two-gen-old 3070, while being limited to just 8GB, is hardly going to impress anyone.

But... based on the inflated GPU market, the lack of competition, and a couple of no-longer-available 12GB sub-£300 cards (especially the 6750 XT), this gets a thumbs up for the perf/$ it offers. Whether skimped or not, the state of the market makes it count.

The rest just boils down to informed decisions (or the lack thereof).
 
I just bought the EVGA 3070 FTW3 Ultra for $315 like three days ago. Have I made a mistake? I'm still on AM4, 5600 non-X.

that doesn't seem like a great deal

So, RT has become Nvidia’s own worst enemy? Giggles echo in the background. Jokes aside, it would be a decent product if it weren’t for the VRAM limitation. I’m expecting a 5060S 9GB to drop soon. :p



I played that game at 1440p with 8GB of VRAM; I just didn't use RT, and the game ran fine.
Do you need RT to play the game? No.
Can you play the game with RT if you have more VRAM and pay more money for that VRAM? Sure.

Most people aren't turning RT on anyway. For a bottom-of-the-barrel GPU, I'm not sure that matters. If you buy this, price is probably the biggest concern.
 
So, RT has become Nvidia’s own worst enemy? Giggles echo in the background. Jokes aside, it would be a decent product if it weren’t for the VRAM limitation. I’m expecting a 5060S 9GB to drop soon. :p



- 3GB chips would get it to 12GB, no problem. Knowing NV, they'd probably go with a 3+3+2+2 set-up to keep costs down but still get it to a more comfortable 10GB pool.
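For what it's worth, the arithmetic is just the per-channel chip capacities summed across the 128-bit bus (four 32-bit channels, one GDDR7 chip each); a minimal sketch, and the mixed set-up above is pure speculation on my part, not an announced configuration:

```python
# Each entry is one memory chip (in GB) sitting on one 32-bit channel
# of a 128-bit bus; the pool is simply the sum of the chip capacities.
def pool_size_gb(chips):
    return sum(chips)

print(pool_size_gb([2, 2, 2, 2]))  # current 5060: 4 x 2 GB  -> 8 GB
print(pool_size_gb([3, 3, 3, 3]))  # all 3 GB chips          -> 12 GB
print(pool_size_gb([3, 3, 2, 2]))  # speculated mixed set-up -> 10 GB
```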
 
I just bought the EVGA 3070 FTW3 Ultra for $315 like three days ago. Have I made a mistake? I'm still on AM4, 5600 non-X.
that doesn't seem like a great deal
I agree it's not a great deal. (I have the same model.) But the 3070 has a x16 PCIe interface, so if your graphics card slot is PCIe 3.0 then it's not the worst choice. You won't want to use the x8 4060/5060 cards on PCIe 3.0.

What's your motherboard?
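Rough numbers on why the link width matters: an x8 card in a Gen 3 slot only gets half the bandwidth of an x16 card in the same slot. A back-of-the-envelope sketch using nominal per-direction spec rates (128b/130b encoding, ignoring other protocol overhead), not measured throughput:

```python
GT_PER_LANE = {3.0: 8, 4.0: 16, 5.0: 32}  # gigatransfers per second, per lane

def pcie_bandwidth_gbs(gen, lanes):
    """Approximate one-direction link bandwidth in GB/s."""
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

print(f"Gen3 x16: {pcie_bandwidth_gbs(3.0, 16):.1f} GB/s")  # ~15.8
print(f"Gen3 x8:  {pcie_bandwidth_gbs(3.0, 8):.1f} GB/s")   # ~7.9
print(f"Gen4 x8:  {pcie_bandwidth_gbs(4.0, 8):.1f} GB/s")   # ~15.8
```

So a 3070 on a Gen 3 board keeps its full link, while an x8 card dropped into the same slot runs at roughly half the transfer rate it was designed around.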
 
I just bought the EVGA 3070 FTW3 Ultra for $315 like three days ago. Have I made a mistake? I'm still on AM4, 5600 non-X.

- Honestly 3070 levels of performance should by all rights be about $150-200 at this point.

You're getting a card that's a little slower than the 5060, same VRAM pool, but also 5 years old.

You shouldn't be paying new-card price/performance for it, not by a long shot.
 
So here is a brief history.

World's first electronically scanned television service started in Berlin in 1935.

30 years later we got an updated standard with the Color Transition in 1965.

The first public HD broadcast in the USA was in 1996. The ATSC launched in 1998. Congress in 2006 voted to force the HD upgrade by 2009.

We are supposedly in the golden age of technology, yet we have been able to use 1080p since the early-to-mid 1990s. It has been nearly 30 years since the first public HD broadcast.

I was able to play some games on my GTX 970 in 2014 at 1440p. I mean, even in 2008 we had monitors like the 30" Gateway XHD3000 with a native resolution of 2,560 x 1,600.

Yet we can't even move the bar in PC gaming after all this time to 1440p or what some call 2k? Let alone 4k? Please.
HD is 720p. 1080p is "FHD".
 
That appears to be a chart for the 16GB card. Testing I have seen for the 5060 Ti 8GB card shows a 10-11 percent loss on average, which is a pretty big hit.
Where did you get those results? I'd believe it for edge cases, but that's an awfully high number for an average.

A 6600 XT, for example, loses between 4-7% when hobbled by PCIe 2.0.
An RTX 3080 gets dinged 4% when hobbled by 2.0
 
5060 Ti loses almost no performance at 4.0 x8. I doubt this card will either.

That's the 16GB model. With the 8GB model there is a loss.

The dilemma of choosing between performance and price...

No doubt, a standard 60-class card that can't outperform last gen's 4060 Ti and even lags behind the two-gen-old 3070, while being limited to just 8GB, is hardly going to impress anyone.

But... based on the inflated GPU market, the lack of competition, and a couple of no-longer-available 12GB sub-£300 cards (especially the 6750 XT), this gets a thumbs up for the perf/$ it offers. Whether skimped or not, the state of the market makes it count.

The rest just boils down to informed decisions (or the lack thereof).

Competition will be here in 8 days. We just have to wait and see if it is any good.

I agree it's not a great deal. (I have the same model.) But the 3070 has a x16 PCIe interface, so if your graphics card slot is PCIe 3.0 then it's not the worst choice. You won't want to use the x8 4060/5060 cards on PCIe 3.0.

What's your motherboard?

The competition I mentioned coming in 8 days has a full x16 link. That would be ideal for PCIe 3.

Where did you get those results? I'd believe it for edge cases, but that's an awfully high number for an average.

A 6600 XT, for example, loses between 4-7% when hobbled by PCIe 2.0.
An RTX 3080 gets dinged 4% when hobbled by 2.0

Depending on the game there can be a noticeable drop in performance:

 
Well, these idle power consumption numbers are unrealistic and a result of systematic measurement error.
First time I've heard about this. Where do you think my systematic measurement error is?

I use calibrated lab test equipment to measure current and voltage going across the slot, and current and voltage across the 8-pin, then P = U * I

Not sure what could be done differently? Or are you looking at the software-reported power in GPU-Z, hoping that might be reality?
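To make the arithmetic concrete (the readings below are made-up example values, not actual measurements from the bench):

```python
# Sum P = U * I over each supply rail feeding the card.
def board_power(rails):
    """rails: list of (volts, amps) pairs, one per measured rail."""
    return sum(v * a for v, a in rails)

# Hypothetical idle readings: slot 12 V, slot 3.3 V, 8-pin 12 V
idle = [(12.1, 0.45), (3.3, 0.30), (12.1, 0.25)]
print(f"{board_power(idle):.1f} W")  # ~9.5 W total board power
```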
 
First time I've heard about this. Where do you think my systematic measurement error is?

I use calibrated lab test equipment to measure current and voltage going across the slot, and current and voltage across the 8-pin, then P = U * I

Not sure what could be done differently? Or are you looking at the software-reported power in GPU-Z, hoping that might be reality?

He's been in his own reality since he joined this forum.....
 
That's the 16GB model. With the 8GB model there is a loss.



Competition will be here in 8 days. We just have to wait and see if it is any good.



The competition I mentioned coming in 8 days has a full x16 link. That would be ideal for PCIe 3.



Depending on the game there can be a noticeable drop in performance:

Thanks for posting that; I hadn't had a chance to find the link and post it myself.
 
HWUB folks are particularly sensitive to low quality textures so they focus on it
I guess they are an isolated group, these odd people who are sensitive to low-quality textures on a new 2025 GPU; instead of focusing on multi-frame generation and path tracing, they ruin everything by showing that the GPU doesn't actually render textures at the intended resolution.
So how do you actually interpret benchmark numbers on an Nvidia GPU with 8GB if the textures are lowered no matter the game preset? Can they be compared with a GPU that has 10GB or 12GB? Do you cherry-pick games that don't go over 8GB? Who knows; some might just not say anything and move on.
 