NVIDIA GeForce RTX 5060 8 GB

Is it just me, or is DF promoting Nvidia? Look at the choice of games: Cyberpunk 2077 - an Nvidia tech demo for years; Alan Wake 2 - heavily Nvidia-sponsored and recently tweaked with RTX Mega Geometry; Black Myth: Wukong - practically made for Nvidia. Even Alex from DF joked in a podcast about how biased the developers were.

The potential list of games was shared ahead of time. I think that was your opportunity to make a case that the games list favors Nvidia cards.


W1zzard did say he plans on revisiting the list around this time of year, so now is also a good time to suggest specific games to add or remove.
 
That's on AMD. They failed to hop on the train; NV never hijacked it.

To be fair, Nvidia did make the bloody thing; it's kind of hard not to expect AMD to be playing catch-up. It's not exactly open source, and even if it were, it would still involve some work.
I guess they could have moved faster, idk, hard to say. As usual, AMD is playing catch-up to whatever new shiny thing Nvidia comes up with.
 
As usual, AMD is playing catch-up to whatever new shiny thing Nvidia comes up with.
NV tries to be competitive; AMD is merely repetitive. NV introduces the game changers most of the time, not AMD.
 
That's just it, you people are NOT helping anyone. Users who are going to look at an 8GB card will understand going into the purchase that they need to manage their expectations and GFX settings. Budget Gaming 101. The continued naysaying nonsense is a waste of time, effort and patience.
The review under this very discussion falls under that same umbrella. It doesn't take 42 pages to conclude "turn down settings to manage your VRAM", but that wouldn't explain the full picture.

W1zzard wrote in his conclusion to focus on getting playable FPS with this card and not worry about RT (which is a fine comment), yet he also commented that this card could have managed RT better had it not been for only 8 GB of VRAM. He didn't have to tell us the second part, so why include it? The simple answer is to make the information known. That's it. Even if you think it's unhelpful information, it's useful to someone.
So it turns out, it actually can't. So what's the issue...?
The chip can. The VRAM they've paired it with cannot.
And if you want to use RT, buy a 16 GB card.
At this point, yes, since there's no changing the stack now. But condemning the 5060 to the same settings as the 4060 was an interesting choice by NVIDIA. That's all. Let's see if they do it again in two years.

Oh, and for the record, the 8 GB version of the 9060 XT is just as weird a product. Both companies can do better but choose not to.
 
That's another thought: is any 8 GB GPU GTA VI-ready? I bet you they aren't. Buy a 5060/9060 XT 8 GB now, and two years from now: damn, it doesn't run well in GTA VI, I need to upgrade my GPU. AMD and Nvidia - mission accomplished.
I would be shocked if Rockstar is not working hand-in-hand with Microsoft, Nvidia, and AMD to make sure GTA6 fully utilizes Neural Shaders to reduce the memory footprint of shaders and textures and make it playable on an 8GB GPU. Probably why GTA6 needs more time before a PC port as well. Also, the RAGE engine is not VRAM-hungry at all. Just look at the GTA 5 performance (https://www.techpowerup.com/review/gta-v-benchmark-performance-test/3.html). Even 2 GB GPUs could run the game better than the 8 GB PS4. Same with RDR2: 4 GB GPUs, with half the VRAM of a PS4, were perfectly happy, even with how detailed the game was.
 
I would only be concerned if it ran out of memory at low texture settings. I'm still not convinced.
Well, you own a 2080 Ti. Granted, that's 11 GB, but it's still an older-gen card. How is your performance? Have you watched your VRAM usage?
but that wouldn't explain the full picture.
Yes, it would; that is the sum total of the situation. Low-end card? Turn down the settings. It really is that simple. There is NOTHING more complicated about it. That has been the reality of budget PC gaming for decades.
 
Raster? Yes. RTRT? Not so much.

While true, they still have a definitive lead in RTRT performance.
Sure, but that's a hardware lead - and it's not because games are Nvidia-sponsored, since the 9070 XT closed the massive gap that existed with RDNA 3. I just hate it when people throw around "Nvidia sponsored" like it even means anything, when Nvidia-sponsored games in fact run better on AMD than the rest of the games, lol.
 
Almost 6 years later and barely faster than a 3060 Ti. nGreedia has no shame, what a pathetic company.
 
Is it just me, or is DF promoting Nvidia? Look at the choice of games: Cyberpunk 2077 - an Nvidia tech demo for years; Alan Wake 2 - heavily Nvidia-sponsored and recently tweaked with RTX Mega Geometry; Black Myth: Wukong - practically made for Nvidia. Even Alex from DF joked in a podcast about how biased the developers were.
In another video they used these games to compare the 5060 to the PS5. Come on, we know the PS5 GPU is low-end by today's standards, but using these games is just nasty.
I wonder how they will spin the "Nvidia GeForce RTX 5060 Review: Better Than PS5 GPU Perf - But 8GB Is Not Enough" review when Grand Theft Auto VI hits PC in 2027 or so. I don't know how that game will run on an 8 GB GPU; Nvidia needs to summon the mother of all AI to compress the shit out of those textures to fit in 8 GB.
That's another thought: is any 8 GB GPU GTA VI-ready? I bet you they aren't. Buy a 5060/9060 XT 8 GB now, and two years from now: damn, it doesn't run well in GTA VI, I need to upgrade my GPU. AMD and Nvidia - mission accomplished.
I don't really care who they're promoting one way or another; the point of the video was to show where the 5060 stands. At least in the first half.

Back when GPUs actually made leaps in performance, you expected at least a one-tier jump each generation: the 3070 as fast as or faster than the 2080 from one gen to the next, then the 4070 faster than the 3080.

Now we barely see a performance uplift from the 4060 to the 5060 - unless, of course, you use DLSS. And you're on Windows, because Nvidia doesn't play as nice with Linux as AMD does. Then you have to make sure the firmware and BIOS on your GPU are up to date, because between that and bad drivers your games might be crashing. I guess the low-end cards do benefit from not using the same power connectors as the larger cards in most cases, so at least the 5060 is less likely to fry your system.

The last few generations have been bad for Nvidia, between the power connectors and everything else. If the cards at the top of the stack weren't improving, they would be more in line with Intel. Yet somehow they are actually delivering a generational uplift at the high end but refuse to translate it to the lower end.
 
somehow they are actually delivering a generational uplift at the high end but refuse to translate it to the lower end.
It's purely because those low-end GPUs sell themselves; you don't need to market them or even make them better than last gen. People on a limited budget, desperate for a new GPU, will buy whatever you drop on them (which wouldn't be the case if AMD flooded the market with ridiculously good-value offerings). Higher-tier GPUs, however, are the enthusiast zone, and those buyers are more nitpicky about what they buy, so NV can't get away with as many counts of murder on that end.
 
The issue is that Lovelace was a terrible launch. Then Nvidia fixed some of the cards with the Super series. Then they launched Blackwell and fixed the low end as it should have been from the start. The core issue here is that we are on the same node. We never see great performance jumps from architectural changes alone, and companies often release marginally better products at a lower price. We had such meh generations way back in the past too. So it is understandable why progress has been so slow: the chip shortage, the TSMC monopoly and all that.

However, if it is so easy to do better, why don't the B580 and, most likely, the RX 9060 outperform it? The issue is systemic rather than one company simply withholding stuff for no reason. The good news is that the situation is likely to improve with time. Manufacturers are investing heavily in expanding chip production. Intel is trying to become a player again, and their 18A node shows promise if it doesn't get delayed again. The next generation will finally move to 3 nm as demand for it subsides and the node matures, and we are going to see proper gains from that alone. GDDR7 now comes in 3 GB modules and they are becoming more available, so going forward Nvidia GPUs are likely to have more VRAM. I have no doubt that tech youtubers are going to celebrate that as their victory rather than it simply starting to make sense from the manufacturing side of things.

It's purely because those low-end GPUs sell themselves; you don't need to market them or even make them better than last gen. People on a limited budget, desperate for a new GPU, will buy whatever you drop on them (which wouldn't be the case if AMD flooded the market with ridiculously good-value offerings). Higher-tier GPUs, however, are the enthusiast zone, and those buyers are more nitpicky about what they buy, so NV can't get away with as many counts of murder on that end.

Tech media is often out of touch with average gamers. The thing with people is that they don't want to spend a lot on a GPU, and they don't need all that performance. It's not even that they don't have the money; it's that their needs aren't that great, and people naturally want to spend the least amount possible. Both of my friends are fully satisfied with budget-class GPUs (an RTX 3060 12 GB and an RX 7600). Years after purchase, they've never played anything that would tax them. Likewise, all those xx50-class cards that youtubers love to pile on have a real purpose. The RTX 3050 6 GB, for example, is a terrific card because it offers massive performance gains for builds without a power connector: it can be powered solely from the motherboard and comes in low-profile models. Previously we had far worse GPUs for that purpose. Likewise, there's a good reason why I ended up with an RX 6500 XT of all things. Each tier has its own requirements and its own buyers. It's not as simple as 'get better FPS/dollar'.
 
The RTX 3050 6 GB, for example, is a terrific card
Sure, but we got no Blackwell or even Ada card for this purpose. And AMD offers even worse stuff in this segment!
 
That's just it, you people are NOT helping anyone.
That's your opinion.
Journalists don't complain about bad companies not being fair, they do their fuc**** job.
Complaining about bad companies not being fair is literally their job.
I would only be concerned if it ran out of memory at low texture settings. I'm still not convinced. Medium, high and ultra settings on the 5060 are asking too much.
Are we being real right now? You expect people to buy a $300 GPU to play games at 1080p LOW?
However, if it is so easy to do better, why don't the B580 and, most likely, the RX 9060 outperform it?
The B580 is still Intel's second gen, and it does outperform it value-wise if both are at MSRP - Intel's practically selling it at cost or at a loss. The 9060 will likely outperform it, though.
 
The potential list of games was shared ahead of time. I think that was your opportunity to make a case that the games list favors Nvidia cards.
What are you talking about? I said DF as in DIGITAL FOUNDRY, not TechPowerUp.
Fun fact: Cyberpunk works better on AMD GPUs than the average game does. Proof that Nvidia is actually playing fair.
Please man, please leave me alone!!! Nvidia playing fair :laugh: :laugh: Path tracing? Ray tracing? No FSR4 in Cyberpunk? Not even FSR 3.1, so they can't enable it.
I would be shocked if Rockstar is not working hand-in-hand with Microsoft, Nvidia, and AMD to make sure GTA6 fully utilizes Neural Shaders to reduce the memory footprint of shaders and textures and make it playable on an 8GB GPU.
Really? You do know GTA VI is CONSOLE ONLY for now, and the GPUs in the Xbox and PS5 have no tensor cores; they can't even do machine-learning upscaling, except for the PS5 Pro. But GTA VI targets the PS5 and Xbox - that's RDNA2, roughly an RX 6700.
 
Sure, but we got no Blackwell or even Ada card for this purpose. And AMD offers even worse stuff in this segment!

Yes, I'm excited about the RTX 5050 release (we just got news about the laptop version). I don't think we'll get a low-power version of it. In desktop DIY those cards are usually less relevant, but they're very much appreciated in the laptop market. I've recommended laptops to people multiple times, and they often land in the RTX xx50 gaming range, or they need a good CPU for slightly less. Such a laptop also saved my skin when I was a poor Erasmus student and my laptop died: I found a deal with a dedicated GPU for a great price. All the other options would have gotten me just an i5 at best, but often just an i3. It served me well and still lets my sister play Classic World of Warcraft on rare occasion.
 
- 3 GB chips would get it to 12 GB, no problem. Knowing NV, they'd probably go with a 3+3+2+2 setup to keep costs down but get it to a more comfortable 10 GB pool.
I don't doubt that Nvidia would have the courage to launch a 5060S 9 GB with a 96-bit bus and slightly faster GDDR7 memory to compensate. They don't care about negative marketing, it seems. :p
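
For anyone checking the math on those configurations: each GDDR7 chip sits on a 32-bit slice of the memory bus, so total capacity is just the sum of the chip densities across the bus. A quick sketch in Python (the 3+3+2+2 mix and the 96-bit "5060S" are hypothetical configurations from the posts above, not announced products):

    # Each GDDR7 chip occupies a 32-bit slice of the memory bus.
    BITS_PER_CHIP = 32

    def vram_gb(bus_width_bits, chip_sizes_gb):
        """Total VRAM given the bus width and the density of each chip."""
        chips = bus_width_bits // BITS_PER_CHIP
        assert len(chip_sizes_gb) == chips, "one chip per 32-bit channel"
        return sum(chip_sizes_gb)

    print(vram_gb(128, [2, 2, 2, 2]))  # 8  - the RTX 5060 as shipped
    print(vram_gb(128, [3, 3, 3, 3]))  # 12 - all 3 GB modules
    print(vram_gb(128, [3, 3, 2, 2]))  # 10 - hypothetical mixed setup
    print(vram_gb(96,  [3, 3, 3]))     # 9  - hypothetical 96-bit card

One caveat on the mixed 3+3+2+2 idea: uneven densities mean part of the pool interleaves across fewer channels, which is one reason such configurations are rare.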
 
Please man, please leave me alone!!! Nvidia playing fair :laugh: :laugh: Path tracing? Ray tracing? No FSR4 in Cyberpunk? Not even FSR 3.1, so they can't enable it.
The moment you were presented with actual data that directly disproves your theory, you changed the subject and went on a tangent about upscalers. This tells me that you are not really interested in a discussion; you are just trying to find reasons to complain about Nvidia. Keep it up, I guess.
 
Really? You do know GTA VI is CONSOLE ONLY for now, and the GPUs in the Xbox and PS5 have no tensor cores; they can't even do machine-learning upscaling, except for the PS5 Pro. But GTA VI targets the PS5 and Xbox - that's RDNA2, roughly an RX 6700.
You can't read. The very next sentence is "Probably why GTA6 needs more time before a PC port as well."

They'll probably rely on texture decompression on consoles, and neural rendering on PC (maybe DirectStorage as well).
 
So you've tested this on your own bench? Or at least can link to someone else's measurements? Let's keep the level of discourse here high, and not muddy things with baseless accusations.



There's so little respect these days for expertise. Anti-intellectualism is celebrated at all levels of society. Eager and uncritical belief in conspiracies is on the rise. What can I do to improve the situation, other than keep replying every time someone is wrong on the Internet?

This is the first time I'm hearing about this. Where do you think my systematic measurement error is?

I use calibrated lab test equipment to measure the current and voltage going across the slot, and the current and voltage across the 8-pin; then P = U * I.

Not sure what could be done differently? Or are you looking at the software-reported power in GPU-Z and hoping that might be reality?
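
For anyone unfamiliar with that methodology, it boils down to summing P = U * I per power rail, measured physically rather than read from the driver. A minimal sketch (the rail names and readings below are illustrative placeholders, not actual bench data):

    # Board power = sum over rails of voltage * current (P = U * I),
    # measured at the PCIe slot and each auxiliary power connector.
    readings = {
        # rail: (volts, amps) - made-up example numbers
        "slot_12V": (12.05, 3.10),
        "slot_3V3": (3.32, 0.40),
        "8pin_12V": (12.02, 9.80),
    }

    for rail, (u, i) in readings.items():
        print(f"{rail}: {u * i:.1f} W")
    print(f"total board power: {sum(u * i for u, i in readings.values()):.1f} W")

This is also why physical measurements and software sensors can disagree: the software value is whatever the driver estimates, not what actually crosses the connectors.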

He's been in his own reality since he joined this forum.....

Both Radeon Software and GPU-Z report 3W "idle" power consumption on the Radeon RX 9070.
And this is not the lowest deep sleep power state.

I have proved it, but no one sees it.

I think the idle power consumption is 2 or 3 watts, not 11 as claimed here.



 
I agree it's not a great deal. (I have the same model.) But the 3070 has an x16 PCIe interface, so if your graphics card slot is PCIe 3.0, it's not the worst choice. You won't want to use the x8 4060/5060 cards on PCIe 3.0.
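
Rough numbers behind that advice, assuming the usual per-lane PCIe throughput figures (~0.985 GB/s for 3.0 and ~1.97 GB/s for 4.0 after encoding overhead); how much the narrower link actually costs varies by game:

    # Approximate usable bandwidth per lane in GB/s, after encoding overhead.
    GBPS_PER_LANE = {3.0: 0.985, 4.0: 1.969}

    def link_bw(gen, lanes):
        return GBPS_PER_LANE[gen] * lanes

    print(f"3070 on PCIe 3.0 x16: {link_bw(3.0, 16):.1f} GB/s")  # ~15.8
    print(f"5060 on PCIe 3.0 x8:  {link_bw(3.0, 8):.1f} GB/s")   # ~7.9
    print(f"5060 on PCIe 4.0 x8:  {link_bw(4.0, 8):.1f} GB/s")   # ~15.8

    # An x8 card in a PCIe 3.0 slot gets half the host bandwidth of an
    # x16 card there; it hurts most when the GPU spills past its VRAM.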

What's your motherboard?
I have an ASRock B550 mobo. Would it be wise to try to sell this off and maybe get the 5060? It's only $10 more than what I paid for the 3070.
 
Both Radeon Software and GPU-Z report 3W "idle" power consumption on the Radeon RX 9070.
And this is not the lowest deep sleep power state.

I have proved it, but no one sees it.

You do realize software readings aren't even remotely accurate, right? It's basically just a guess reported by the driver.

I'm sure W1z can explain it better.
 