
NVIDIA GeForce RTX 5060 8 GB

The same energy needs to be applied to the 9060 XT 8GB review, regardless of whether it's 10-20% faster. I will say at least AMD is giving gamers a choice by offering a 16GB model, but when they hit retail my guess is they'll be expensive, not because they are good but because the 5060/5060 Ti 8GB are just that bad. Only a company as big as Nvidia could help sell AMD cards over MSRP lmao.



It's starting to feel like 50 shades of green in here.... :laugh:
If they manage to top the performance/value chart, I'll be all over the 9060 XT. I don't see brands, unlike other people. But with that said, whoever called this a waste of sand should (but won't, because brand) consider the 9060 XT also a waste of sand and absolutely horrible, because frankly I don't see it being meaningfully faster or a better value.
 
The same energy needs to be applied to the 9060 XT 8GB review, regardless of whether it's 10-20% faster. I will say at least AMD is giving gamers a choice by offering a 16GB model, but when they hit retail my guess is they'll be expensive, not because they are good but because the 5060/5060 Ti 8GB are just that bad. Only a company as big as Nvidia could help sell AMD cards over MSRP lmao.



It's starting to feel like 50 shades of green in here.... :laugh:
Exactly, people want fairness towards these 8GB cards, so the same standard needs to be applied to the 9060 XT. I think AMD should've only gone with a 16GB card, though the good thing is the MSRP for the 16GB is the same as what the 8GB 5060 Ti sells for. Then again, MSRP hasn't held up on any of these GPUs; MSRP is just as flawed as making claims of price/performance without actually talking about the performance of the card itself.
The 5060 Ti review thread felt like 50 shades of green; now people are just passing around the Kool-Aid.
If you want to put the blame on GPUs for not getting "good enough" performance in games, then you have to put most of the blame on AMD, since they have the slower cards, the cards with less value (totally objective: performance/$), and the cards that currently, and even more so historically, have been doing horribly in heavy games (RT/PT). But I'm not blaming either of them; games have become notoriously heavy for not a lot of gains in visuals.

If you are saying that 8GB of VRAM is "planned obsolescence", why the heck are you only mentioning Nvidia? AMD also offers 8GB VRAM cards. In fact, Nvidia offers the CHEAPEST GPUs that have more than 8GB of VRAM. AMD does not. So AMD is the prime culprit of the tactic called planned obsolescence, if that's your argument.
I'm putting most of the blame on Nvidia because they have 90% of the market and the power to change the rest of the market in terms of VRAM and performance at the low end, but chose not to, instead milking budget gamers for the 3rd time in a row with an 8GB card that doesn't even beat the previous gen just one tier higher.
And you're not going to get decent RT performance out of a 5060. I doubt anyone buying an 8GB card even gives a crap about RT, let alone PT (interesting how the goalposts move to PT once the competition has improved at RT).
Having only 8GB is planned obsolescence no matter who is selling it, but you keep ignoring that Nvidia has the market share and mindshare, with a chance to move the low end to 10 or 12GB, but still hasn't chosen to because of their disdain towards any consumer keeping a GPU for more than a single launch cycle.
This is pretty clear: when it comes to limited VRAM and planned obsolescence, Nvidia's cheapest GPU with more than 8GB of VRAM costs $220 and is at the top of the value chart; AMD's cheapest GPU with more than 8GB of VRAM costs $400 and is at the bottom of the value chart. But somehow we have to keep talking about ngreedia...


View attachment 401605
Except you can't just talk about price/performance without looking at the performance of the card itself: it's slower than an RTX 3070 from 5 years ago, it's no progress over the 4060 Ti from 2 years ago, and the 5060 has higher power consumption than the 4060. IMO, these things should be considered when purchasing an 8GB card, not just blindly buying on a single metric.
 
Gigabyte 5060 LP will be selling soon
View attachment 401565
Newegg had this at launch for $339. Zotac and I think one other company also showed off low profile options.

Sparkle announced but hasn't yet released their Arc B570 low profile, which I think is set for Q3?

To my knowledge there are no announced Radeon 9060 XT low profile or single-fan cards.
 
The difference is that most websites and YouTube channels have all the most high-end setups to test side by side. They can see what fake frames and added latency look like versus native. Maybe some games perform better with fake frames, but with only 8GB of video memory even old games like Control won't generate proper frames, because bad frames can't be turned into good frames. Any limitation you have will just duplicate itself.

The truth of the matter is that the most beneficial place for these technologies is when you already have super high frame rates, like 90 or 100, and then use the tech on top of that. Even then, though, you add latency, which might be negligible in some games, but I have seen cases where the latency is just really bad.
There is no truth to that. As I said, I stand with the reviewer on the assessment. There is a massive gaslighting going on about the actual reality of DLSS4 on these low-end GPUs; this site is one of the few reviewers that has been honest about the experience you'll have when upscaling with quality DLSS4 from 1080p and activating frame gen x2 from a 45-50 fps base.
 
I just won an RTX 5060 MSI Ventus Plus OC in a contest, and I'm amazed by the visual sharpness of the games (playing at 4K). Having previously owned an RTX 3070 and 3080, it seems the upscaling and picture quality of Blackwell are very noticeable. People are negatively focused on some gimmicky things but ignore that the 5060 is beating the 3060 Ti, 3070, 4060, and 4060 Ti (both 8 and 16GB) hands down, and I am certain that with driver updates it will get a lot better. The only thing I think Nvidia should have done is offer a 12 or 16GB memory option on the 5060 in addition to 8GB, but the memory bandwidth is a tremendous 448 GB/s (vs 288 for the 4060 Ti) for an entry-level card. I don't know what the problem is with those attention-seeking drama queens on YouTube making all this Fuss, seeking attention and creating fake controversy to get more subscribers and views, and thus more sponsors and more Money.
This card was so mischaracterized by these YouTube scumbags and Leeches, and yet it has great performance/cost and a great end-user experience, and allows more with DLSS 4, MFG, and other features. It's better than previous-gen 60 and 70 class cards, with a giant leap in Chip and memory performance, but the reviews were killing it before it was even released!!!!! We live in really strange Times.
I'm happy for your free graphics card prize, but your comment is deranged.

There's no change in picture quality between architectures. 5060 is below all the cards you listed, sans 3060Ti, on all resolutions. "Scumbags" is a word that should be reserved for people that severely harm others, and not just those you disagree with. Common nouns only get capitalized when they are the first word in a sentence.
 
I'm putting most of the blame on Nvidia because they have 90% of the market and the power to change the rest of the market in terms of VRAM and performance at the low end, but chose not to, instead milking budget gamers for the 3rd time in a row with an 8GB card that doesn't even beat the previous gen just one tier higher.
And you're not going to get decent RT performance out of a 5060. I doubt anyone buying an 8GB card even gives a crap about RT, let alone PT (interesting how the goalposts move to PT once the competition has improved at RT).
Having only 8GB is planned obsolescence no matter who is selling it, but you keep ignoring that Nvidia has the market share and mindshare, with a chance to move the low end to 10 or 12GB, but still hasn't chosen to because of their disdain towards any consumer keeping a GPU for more than a single launch cycle.
If you have 90% of the market share you have zero incentive or reason to push better products. It's your competition, approaching 0%, that is supposed to, well, try to compete by offering better products. Nvidia has the best value cards and the cheapest cards with over 8GB of VRAM. AMD in the meanwhile is hardcore dropping the ball. Period. Stop putting the blame on Nvidia just because you hate the company. Facts don't care about how you feel about them. So realistically, the company that is fast approaching 0% market share and does nothing to change that has more disdain for its consumers. Yet you keep defending them, god knows why.

We are blaming Nvidia for AI, but hey, who removed themselves from the high-end, big-die market to keep their dies for AIAIAIAIAI? It wasn't Nvidia, last time I checked.

Except you can't just talk about price/performance
Yes, you can. That's the whole point of objective things like... MATH. If it has the best performance/$, then every other card has worse perf/$. Your opinion is irrelevant.
 
If you have 90% of the market share you have zero incentive or reason to push better products. It's your competition, approaching 0%, that is supposed to, well, try to compete by offering better products. Nvidia has the best value cards and the cheapest cards with over 8GB of VRAM. AMD in the meanwhile is hardcore dropping the ball. Period. Stop putting the blame on Nvidia just because you hate the company. Facts don't care about how you feel about them. So realistically, the company that is fast approaching 0% market share and does nothing to change that has more disdain for its consumers. Yet you keep defending them, god knows why.
When you have 90% of the market you're supposed to be pushing the market forward, not stagnating on the same mediocre 8GB card with a new name for years. Nvidia is pulling the same tactic Intel did with their quad cores; the difference is Nvidia gets praised for it.
And see, AMD gets accused of not competing when they make a decent card at a lower price, and blamed for not magically selling a better card than Nvidia for less, even though Nvidia would launch another Ti or Super card the instant AMD launches anything faster. Sure, the company which has the console market and isn't holding back the majority of the gaming market with 8GB cards is to blame; you're clearly defending Nvidia to the point of being delusional because you like the company.
We are blaming Nvidia for AI, but hey, who removed themselves from the high-end, big-die market to keep their dies for AIAIAIAIAI? It wasn't Nvidia, last time I checked.
The high end is only a fraction of the GPU market, all very loyal to one brand no matter what AMD does. Nvidia kept all their big dies for AI while increasing prices on smaller dies for the gaming market, using AI to write drivers, but no, we're supposed to blame AMD for everything.
Yes, you can.
No, you can't just throw up a price/performance graph without comparing performance to other GPUs. The math doesn't add up there.
 
If you have 90% of the market share you have zero incentive or reason to push better products.
I am in that 90%, and Nvidia is incentivized to convince me to upgrade.

Except you can't just talk about price/performance without looking at the performance of the card itself: it's slower than an RTX 3070 from 5 years ago, it's no progress over the 4060 Ti from 2 years ago, and the 5060 has higher power consumption than the 4060. IMO, these things should be considered when purchasing an 8GB card, not just blindly buying on a single metric.
Yes, you can. That's the whole point of objective things like... MATH. If it has the best performance/$, then every other card has worse perf/$. Your opinion is irrelevant.
Objective measurements are only one factor for the actual (subjective!) question that matters: how much am I enjoying my game? And that's affected by all sorts of factors that don't show up on a perf/$ chart.

It's also a non-linear result. I might feel a $300 card that gives an unsteady 50 FPS is "not enjoyable; quit game" and $450 for a steady 60 FPS is "enjoyable; play game". On a perf/$ chart, both look like reasonable upgrades for someone who currently owns a 30 FPS tier card. But practically, only one is worth spending money on.
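The non-linearity argument above can be sketched numerically. All figures here are the hypothetical ones from the post (a $300 card at an unsteady 50 FPS vs a $450 card at a steady 60 FPS), not review data, and the 60 FPS "enjoyment threshold" is just an illustrative assumption:

```python
# Hypothetical cards: price in $, average FPS delivered (made-up numbers).
cards = {
    "card_a": (300, 50),   # unsteady 50 FPS
    "card_b": (450, 60),   # steady 60 FPS
}

# Raw perf/$ (FPS per dollar) makes card_a look like the better buy...
perf_per_dollar = {name: fps / price for name, (price, fps) in cards.items()}

# ...but a simple enjoyment threshold flips the picture: below 60 FPS the
# game is "not enjoyable; quit game", so the card's practical value is zero.
def practical_value(price, fps, threshold=60):
    return fps / price if fps >= threshold else 0.0

for name, (price, fps) in cards.items():
    print(name, round(perf_per_dollar[name], 4), practical_value(price, fps))
```

On the raw metric card_a wins (0.1667 vs 0.1333 FPS/$), yet under the threshold model it is worth nothing, which is the point: a single linear metric hides the step function in how people actually experience frame rates.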
 
I am in that 90%, and Nvidia is incentivized to convince me to upgrade.
And they do, by offering you the best value cards, the fastest cards for every single task (gaming, CUDA, AI, you name it), and the cheapest cards with high VRAM. At some point their competition needs to start incentivizing too, because this is embarrassing. Nvidia leads on every metric that matters; AMD leads in the number of deranged fans on the internet.
 
And they do, by offering you the best value cards, the fastest cards for every single task (gaming, CUDA, AI, you name it), and the cheapest cards with high VRAM. At some point their competition needs to start incentivizing too, because this is embarrassing. Nvidia leads on every metric that matters; AMD leads in the number of deranged fans on the internet.
(I swear I didn't look at any benchmarks before coming up with this hypothetical.) Let's say I'm a huge Monster Hunter fan and I own a 2060 Super, which performed well for the last Monster Hunter game but can't handle the new one. What should I buy to play Monster Hunter Wilds? Remarkably, the 9070 XT has almost exactly the same FPS for MHWilds that the 2060 S had for MHWorlds, and it crushes Nvidia for value; it's tied with the 5080.
 
(I swear I didn't look at any benchmarks before coming up with this hypothetical.) Let's say I'm a huge Monster Hunter fan and I own a 2060 Super, which performed well for the last Monster Hunter game but can't handle the new one. What should I buy to play Monster Hunter Wilds? Remarkably, the 9070 XT has almost exactly the same FPS for MHWilds that the 2060 S had for MHWorlds, and it crushes Nvidia for value; it's tied with the 5080.
Sure, and then you get into some path tracing and it's slower than a 5060 Ti. That's why you take the averages (if you play a large variety of games). And when you take the averages, Nvidia is topping the value charts.
 
What's interesting is that if you look at xx60-class performance relative to other NV cards of the same gen at 1080p, the 3060 had by far the smallest delta to the 3090 Ti (88%), while the 5060 has the largest delta vs the 5090 (157%). The 4060 had almost the same delta to the 4090 as the 5060 has to the 5090 (151%).

OTOH, since everything below the 90-class card in the 5xxx series stagnated, the 5060 looks relatively better vs the 5080 (102%) than the 4060 did vs the 4080 (113%). So it sort of feels like progress at the end of the day. The 3060 is still the king with a 61% delta vs the 3080 10GB.
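For anyone checking these numbers, "delta" here reads as how much faster the bigger card is than the xx60, i.e. (top FPS / xx60 FPS - 1) × 100. A quick sketch of the formula; the FPS values below are made up purely to reproduce two of the quoted percentages, not taken from any review:

```python
def delta_pct(fast_fps, slow_fps):
    """How much faster the fast card is than the slow one, in percent."""
    return (fast_fps / slow_fps - 1) * 100

# Hypothetical 1080p averages, chosen only to illustrate the arithmetic:
# if an xx60 card averages 100 FPS, an 88% delta means the flagship does 188.
print(round(delta_pct(188, 100)))  # 88  (3060 -> 3090 Ti style gap)
print(round(delta_pct(257, 100)))  # 157 (5060 -> 5090 style gap)
```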
 
If they manage to top the performance/value chart, I'll be all over the 9060 XT. I don't see brands, unlike other people. But with that said, whoever called this a waste of sand should (but won't, because brand) consider the 9060 XT also a waste of sand and absolutely horrible, because frankly I don't see it being meaningfully faster or a better value.
My prediction, based on the available synthetic benchmarks comparing within the same architecture:
OpenCL: 68.7% of the RX 9070 XT's performance
Vulkan: 68.5% of the RX 9070 XT's performance
This would put it above the RX 7700 XT; the closest-performing GPU in games would be the RX 6800 XT.

Testing the same method on the previous gen:

The 7600 XT in synthetics has 68-70% of the 7700 XT's performance; in games it has 66% at 1440p (mainly due to the x8 PCIe link) and 70.5% at 1080p, per TechPowerUp reviews.

From this analysis we can expect that the RTX 5060 Ti 16GB will lose to the RX 9060 XT 16GB.
In TechPowerUp reviews the RTX 5060 Ti 16GB has 60% of the RX 9070 XT's performance, and I predict the RX 9060 XT 16GB will have 65-68% of the RX 9070 XT's performance in their 1440p review.
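The extrapolation method above can be written out explicitly. All inputs are the poster's own estimates (synthetic and game ratios vs the flagship), not measured results, and the helper name is just for illustration:

```python
# Predict a new card's in-game performance ratio from its synthetic ratio,
# calibrated by how synthetics translated to games in the previous gen.
def predict_game_ratio(synth_ratio, prev_synth_ratio, prev_game_ratio):
    # Scale the synthetic ratio by last gen's synthetic-to-game conversion.
    return synth_ratio * (prev_game_ratio / prev_synth_ratio)

# Previous gen: 7600 XT scored ~69% of the 7700 XT in synthetics,
# ~66% in 1440p games. New gen: 9060 XT scores ~68.6% of the 9070 XT
# in synthetics (average of the OpenCL/Vulkan figures above).
prediction = predict_game_ratio(0.686, 0.69, 0.66)
print(f"{prediction:.1%}")  # 65.6%, i.e. inside the predicted 65-68% band
```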
 
I don't know why Zotac can't put this cooler on the 16GB 5060 Ti; there is thermal headroom to spare.

Same story with the 4060 Ti: 8GB versions have single-fan coolers but the 16GB doesn't. Does it take an extra fan to cool 8GB of extra VRAM chips?

Why is this review so much more positive than the ones on YouTube? Honest question... I haven't had the hour necessary to compare numbers, but maybe somebody has and would like to share their take on it.

Kudos to Nvidia for making what AMD CAN'T.
The last time AMD made such a small, but not dog-crap-slow GPU was the Radeon R9 Nano.

 
Die sizes are shrinking because nodes get smaller

Nodes are constantly getting smaller, and yet historically the die size of each GPU tier has generally stayed within a range that was normal for that tier.

It's only in recent Nvidia generations that die sizes have gone drastically under the typical size for each tier, excluding of course the 5090. The 5090 also breaks your logic here, as its die size went up; mind you, it certainly isn't the only GPU to see its die size increase.

and Nvidia has chosen to prioritize cost and power consumption instead of performance gains in this segment.

Surely you jest, given the 5080's and 5090's power consumption.

Architecturally speaking, Blackwell provides a margin-of-error improvement to power consumption (for all intents and purposes, zero), so we know for a fact Nvidia did not prioritize power consumption.

The low power consumption of Nvidia's mid and low-end cards is the result of the small die sizes, because you are getting physically less. It's akin to calling a mini KitKat healthy because it has 1/8th the calories of the regular-sized candy.

These "8GB in 2025" cards need a side-by-side image quality comparison using those cool sliders and settings pictures you guys use, which are much more telling than a written description. Some YouTube content out there shows this, and even at reduced settings the differences can be striking.

Yeah, the numbers are misleading, as the actual visual quality takes a hit on 8GB cards in some titles even when the settings are the same.

They're also forgetting, or deliberately ignoring, the fact that anyone who's interested in the 8GB offerings is not running 4K. Some might be at 1440p, but most will be running 1080p with more modest settings than Max/Ultra.

HWUB has shown that even at 1080p, 8GB is not enough. Even in scenarios where performance doesn't take a dive, visual quality may take a hit. That's something not addressed in this review.
 
And they do, by offering you both the most valuable cards, the fastest cards for every single task available (gaming, cuda, AI, you name it), and the cheapest cards with high vram.. At some point their competition needs to also start incentivizing cause this is embarrassing. Nvidia leads on every metric that matters, amd leads on amount of deranged fans on the internet.
The Intel B570 is cheaper and has 10GB. The RTX 3060 is cheaper and has 12GB. Is 8GB high? No.
People, wait for the 12GB version.
It's interesting that high idle power consumption isn't listed among the cons.
Why is there no RX 7600 XT (16GB!) in the table, for example, which supposedly draws 4W in idle (Sapphire), while the 5060 draws as much as 9W? But there are stupidities like PCIe 5 listed.
 
Just a little tip: Nvidia is 90% there to make money.

Y'all complain about prices, yet prices were basically locked for over 20 years. They just got with the times, much like game makers now charging well over the standard 50-60 bucks.

We are just a drop in an enormous pond. Nvidia has much better things to make for cars, AI, etc. Knowing what they get for those products, with much higher demand than GPUs, I feel blessed they don't just walk away from us.
 
Surely you jest, given the 5080's and 5090's power consumption.

Architecturally speaking, Blackwell provides a margin-of-error improvement to power consumption (for all intents and purposes, zero), so we know for a fact Nvidia did not prioritize power consumption.

The low power consumption of Nvidia's mid and low-end cards is the result of the small die sizes, because you are getting physically less. It's akin to calling a mini KitKat healthy because it has 1/8th the calories of the regular-sized candy.
Why for the love of god do we constantly have to make stuff up and reinvent the wheel? It took 5 seconds to look at the review and prove otherwise. It seems Nvidia does prioritize power consumption; that's why, out of the 13 most efficient cards for gaming, 12 are from Nvidia. What the heck are you guys thinking when making claims like that? I don't get it. That 5080 you called out is sitting at the top of the freaking chart, my man. What's your next claim, that Nvidia is lagging behind in path tracing?


energy-efficiency.png
 
Just a little tip: Nvidia is 90% there to make money.

Y'all complain about prices, yet prices were basically locked for over 20 years. They just got with the times, much like game makers now charging well over the standard 50-60 bucks.

We are just a drop in an enormous pond. Nvidia has much better things to make for cars, AI, etc. Knowing what they get for those products, with much higher demand than GPUs, I feel blessed they don't just walk away from us.
They have to stay diversified. I mean, if something happens with AI, governments could ban it, or ban uses of it. It is still untested waters. They are even still trying to figure out how to deal with the fact that most models have literally pirated millions of copyrighted works: art, music, movies, etc.

Even then, copyright law around AI is still in its infancy, not allowing a lot of AI output to be copyrighted from the get-go.

The loopholes and whatnot that could come from this can't be seen yet. Imagine Sony or Nintendo had an employee use AI for a bunch of code to secure the firmware or something, and they get hacked. They go to sue, and it is deemed that because they used AI, their lawsuits to stop it aren't successful.

Sure, a lot of hypotheticals, but still, Nvidia has to stay diversified. If it wasn't profitable they wouldn't stay in the game. They are making the Switch 2 GPU, and I am sure their numbers for prebuilt computers are pretty big as well. It all just stinks that they are such a crummy company with anti-consumer sentiment that thinks everyone should just bow the knee.

I would also imagine they can take a bunch of parts that don't meet high-end specs and turn them into other products down the product stack. They also keep cutting corners on consumer-grade products, like on memory bandwidth. I mean, a GTX 970 that sold for something like $330 had a 256-bit memory interface. Now a $550 RTX 5070, a successor in the xx70 line, is $200+ more expensive and has a 192-bit memory interface.
 
Just a little tip: Nvidia is 90% there to make money.

Y'all complain about prices, yet prices were basically locked for over 20 years. They just got with the times, much like game makers now charging well over the standard 50-60 bucks.

We are just a drop in an enormous pond. Nvidia has much better things to make for cars, AI, etc. Knowing what they get for those products, with much higher demand than GPUs, I feel blessed they don't just walk away from us.

Yep, and that is 99% of the reason everything below the 5090 is stagnating, to the point that even the 4070 Super, a refresh, was more impressive than the rest of the 50 series vs what they replaced.

Comically, this card got a bigger uplift vs the 4060 than the 5080 did vs the 4080, but it still has worse price-to-performance than both the 7600 XT/4060 and, I believe, even the 4060 Ti, one of the worst Nvidia GPUs of the last decade from a generational-improvement standpoint.

#Goodtimes

Beyond that, the competition has been mostly a joke, but the 2025 GPU market is so bad even AMD is selling above MSRP, which is a first outside of a crypto boom, at least in the last 15 years.

Still, I don't really blame Nvidia: if this is the effort they have to make to release a card, and their fanboys will die on a hill defending them, why do better?

In the RT era we've gone from the 60 class matching the previous generation's 80 class to it not even being able to jump past one tier in performance. #progress

Part of me hopes that the Nvidia Kool-Aid drinkers are only hyping this card because they love trolling the red Kool-Aid drinkers more than they care whether Nvidia releases good products. Otherwise, damn, I never realized 4090/5090 owners were so easy to please...
 
Since Hardware Unboxed started showing in videos that GPUs with less VRAM than needed don't load textures properly (they come out blurry and low quality), it is a MUST that reviewers check whether that happens and show it in the charts next to the FPS number, so buyers know whether the GPU is good enough to play the specific game properly.

I don't think they started showing anything shocking; GPUs have worked like that forever. They just present scenarios that are often unrealistic by their own standards, like using ultra settings when they themselves made a video saying it was stupid to do so.

You know a stupid person is often too stupid to realise they are stupid. I think this is the phenomenon we're facing here.

Just a little tip: Nvidia is 90% there to make money.

What are the remaining 10% for? They never gave away any product for free to consumers that I'm aware of. I'm pretty sure it's 100%, like any other company on the planet.
 
What are the remaining 10% for? They never gave away any product for free to consumers that I'm aware of. I'm pretty sure it's 100%, like any other company on the planet.

99.9999999%, then. They do occasionally give away GPUs, lol, like those Jensen-signed limited edition cards...
 
It's interesting that high idle power consumption isn't listed among the cons.
Why is there no RX 7600 XT (16GB!) in the table, for example, which supposedly draws 4W in idle (Sapphire), while the 5060 draws as much as 9W? But there are stupidities like PCIe 5 listed.
The review was written for nvidia and it shows.

Well, these idle power consumption numbers are unrealistic and the result of systematic measurement error.
There is no way this little card draws 9W in idle; more likely 0.9W.
 
Here is the thing: the reason we are still gaming at 1080p at all is the graphics card companies.
I disagree entirely. 1080p is a sweet-spot resolution that is affordable. The video card companies have nothing to do with it; I'm not going into that further.
 