
NVIDIA's RTX 5060 "Blackwell" Laptop GPU Comes with 8 GB of GDDR7 Memory Running at 28 Gbps, 25 W Lower TGP

Nah, it’s alive, just in a different form. It’s basically, and forgive me if I’m getting this wrong, now a low-power mode that ALL laptop GPUs possess instead of being a separate GPU SKU. And also a shit-ton of other “power saving” tech. Even NV seems to basically have no idea what it is anymore. But it is “AI powered”, so you know it’s good, right?

Yeah, they just have TDP ranges now, so not only do you have to look at the specific GPU but also the power the laptop allows it to use... Makes buying the higher-end stuff annoying because a 4080 laptop can perform drastically differently depending on TDP configuration.
 
Sorry? Does 1080p on a laptop use less VRAM than 1080p on desktops? Because TechSpot has already demonstrated games suffering performance issues at 1080p with 8 GB.

Since when exactly is 8 GB a problem with a 1080p display? I must have missed something
 
Since when exactly is 8 GB a problem with a 1080p display? I must have missed something
Because HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!
 
Because HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!

Also, there are more than just performance issues; in some games, LODs and texture quality are automatically downgraded on an 8 GB GPU.

Even Black Myth: Wukong, which looks fantastic, has poor texture quality if you look too closely, even at max settings, likely a concession to stay within 8 GB.

My issue with 8 GB cards at the entry level has nothing to do with performance or what settings someone has to use; it's stagnation for almost a decade in the $300-ish price range and below, and with Nvidia, the $400 price range and below...
 
Because HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!

I mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine

Also, there are more than just performance issues; in some games, LODs and texture quality are automatically downgraded on an 8 GB GPU.

Even Black Myth: Wukong, which looks fantastic, has poor texture quality if you look too closely, even at max settings, likely a concession to stay within 8 GB.

My issue with 8 GB cards at the entry level has nothing to do with performance or what settings someone has to use; it's stagnation for almost a decade in the $300-ish price range and below, and with Nvidia, the $400 price range and below...

Wukong on medium settings 1080p at full resolution scale (so true 1080p render) will fit in 6 GB, you saw my post on the benchmark thread. Reasonable to ask medium settings out of a card such as the 1070 Ti, which this 5060 will no doubt wipe the floor with
 
I mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine



Wukong on medium settings 1080p at full resolution scale (so true 1080p render) will fit in 6 GB, you saw my post on the benchmark thread. Reasonable to ask medium settings out of a card such as the 1070 Ti, which this 5060 will no doubt wipe the floor with

What I was saying is, even at the cinematic preset, texture quality isn't very good on close inspection, so I'm not surprised it has modest VRAM requirements; but games in general have had meh texture quality.

I feel this is a side effect of how much Nvidia has stagnated on VRAM at the low to mid range...

Thankfully mods like the Witcher 3 HD rework and CP2077 HD rework are saving the day.
 
The main issue is there is no oof. 3060 didn't make 2060 look garbage. Neither did 4060. Similar price, similar performance. I don't like it and I don't buy it, we all should thank enthusiasts for their upgrade itch which makes it viable to sell stuff for a lot of money. 4070M level performance doesn't sound impressive, unless we're talking extremely cheap laptops ($750 or lower).

8 GB is a problem but lack of bang per buck improvement is a much bigger problem.
 
What I was saying is, even at the cinematic preset, texture quality isn't very good on close inspection, so I'm not surprised it has modest VRAM requirements; but games in general have had meh texture quality.
Honestly, it’s a bit hard to say nowadays if it’s the texture quality being low or the over-abundance of post-processing and temporal AA making everything LOOK like it is. For what it’s worth, my friend works in 3D modeling and regularly looks at models from current games and his opinion is that the textures are mostly fine, decently high-res (his rants about shit alpha-channels aside since everyone just relies on TAA to hide dithering), it’s just that developers just rely on engines to do a LOT nowadays automatically and it often looks wrong.
 
The main issue is there is no oof. 3060 didn't make 2060 look garbage. Neither did 4060. Similar price, similar performance. I don't like it and I don't buy it, we all should thank enthusiasts for their upgrade itch which makes it viable to sell stuff for a lot of money. 4070M level performance doesn't sound impressive, unless we're talking extremely cheap laptops ($750 or lower).

8 GB is a problem but lack of bang per buck improvement is a much bigger problem.

For sure, the lack of progress at the 60/60 Ti range has been troubling.

Honestly, it’s a bit hard to say nowadays if it’s the texture quality being low or the over-abundance of post-processing and temporal AA making everything LOOK like it is. For what it’s worth, my friend works in 3D modeling and regularly looks at models from current games and his opinion is that the textures are mostly fine, decently high-res (his rants about shit alpha-channels aside since everyone just relies on TAA to hide dithering), it’s just that developers just rely on engines to do a LOT nowadays automatically and it often looks wrong.

True, but mods can and do improve it substantially. My guess is game sizes already ballooning out of control is also a factor, but an optional update for higher-quality/higher-resolution textures would be nice.
 
Even black myth wukong which looks fantastic has poor texture quality if you look too closely even at max setting a likely concession to stay within 8GB.
Is that your opinion of what's happening, or confirmed to be what's happening? From all the content I've seen, it has nothing to do with VRAM pool size, as some textures are repeatably poor quality on cards with 12+ GB.

Overall I agree on the pricing, though; it'd be nice to see more VRAM across the entire stack, but on this SKU I don't see it as a necessity. I also don't know what to even try to believe at this point; we usually get high-end desktop parts well before mobile, so it seems odd to get a leak about a mobile 5060 at this point in the pre-release leaks.
I mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine
Oh I agree, it might not be the GPU that appeals to me the most, but I'd wager it's fine. It was more a dig at HUB/Techspot and the testing they do to highlight problems, with methodology specifically constructed to prove their supposition correct (on at least one or more occasions).
 
I mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine
I mean, why does JHH need to be so greedy? Isn't that $3 trillion market cap enough? :shadedshu:

Rhetorical question btw :ohwell:

Oh I agree, it might not be the GPU that appeals to me the most but I'd wager it's fine.
So if you want to run custom ML models you now need to pay more for VRAM as well :wtf:
 
Because HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!
As in: "play games designed for consoles with 16GB of RAM". Such stringent testing!

IDK why you guys are so insistent on holding onto that 8 GB lifeline. Consoles have 16 GB of RAM. 8 GB is no longer a baseline. It's OK to move on. Buying a GPU with 8 GB of RAM at this point is like buying a 2 GB GPU in 2015: a pointless move that will likely backfire.

But I forgot, Resident Evil Village is a specifically crafted benchmark; nobody actually PLAYS it.
 
@oxrufiioxo
The ballooning sizes are also, IMO, pure laziness that stems from a trend started on previous console gens. Devs stopped compressing assets to save on CPU resources due to consoles having weak CPUs, and they just kinda… kept going like this. The next COD is what, like 400 gigs for a full install? Shit’s ridiculous.
 
It's probably more to do with DirectStorage, because Sony/MS(?) have something which desktops still don't.
 
Since when exactly is 8 GB a problem with a 1080p display, I must have missed something
There are games, such as RE: Village and Forspoken, that, even at 1080p, experience issues with 8 GB cards. Village had crashing issues and severe stuttering; Forspoken textures would straight up not load properly. Frame graphs looked fine, but anyone LOOKING at the gameplay could tell something was wrong.

I just really don't get it. These GPUs cost nearly as much as a console themselves and cannot play at the settings consoles can, yet you still need the rest of the PC. Yeah, sure, your use case may not demand it now, but when 512 MB and 2 GB cards went out to pasture, there wasn't this constant demand that games be made to work on them; people understood that they were outdated and it was time to move on. But with 8 GB, people get really hung up on it, like it's some kind of insult to say 8 GB just isn't enough anymore. When Wolfenstein: The New Order came out and needed 3+ GB for the highest settings, people didn't cry about how it should fit fine in 2 GB; they accepted that 2 GB cards were coming to their end.

Any card over $200 should have more than 8 GB of VRAM today. To do otherwise is an insult to the consumer. If you are buying these $400+ cards and running games at low settings to avoid running out of VRAM, something is seriously wrong.
 
It's probably more to do with DirectStorage, because Sony/MS(?) have something which desktops still don't.

So far, DirectStorage has been a bust on PC, with lowered performance for negligible differences in loading, etc., with Nixxes basically saying there are better options on PC.
 
I meant Sony already had dedicated hardware for compression on the PS5; MS probably has something similar, although in software. So consoles are much better than PC in that regard, as of now.
 
As in: "play games designed for consoles with 16GB of RAM". Such stringent testing!
It's my understanding consoles can't use all 16 GB as video memory, more like 10-12 GB, and they're often targeting 4K output. And yeah, so there are at most a handful of games that can't manage the maximum setting, and seemingly people are allergic to turning the texture setting down one, maybe two notches.

I absolutely agree there are games where you can't max all the sliders at 1080p on an 8 GB card, but the equation isn't as simple as that fact being true = 8 GB is a wholly insufficient amount for certain SKUs.
IDK why you guys are so insistent on holding onto that 8GB lifeline.
I'm not; I wouldn't buy this myself, but I think the issue is also somewhat overblown for an unreleased, rumored low-end mobile video card. Personally, having targeted 4K120 since my last GPU upgrade, I'll be after a 16+ GB card when I next upgrade.
 
I mean, why does JHH need to be so greedy? Isn't that $3 trillion market cap enough? :shadedshu:

Rhetorical question btw :ohwell:


So if you want to run custom ML models you now need to pay more for VRAM as well :wtf:

Market Cap isn't wealth. It's just what investors think the price of stock should be at any given time.

The thing is that adding 12 GB VRAM to an entry level laptop GPU isn't going to put more money into Huang's pocket. It will probably take money out of his pocket. The added cost will just be passed down to consumers who are on a tight budget and already looking hard at the cost of the laptop. Some may choose not to buy it if it adds more than they are comfortable with spending.

On a tight budget and making compromises is a difficult subject to broach on a tech enthusiast site. It's not how most of us think, but I can assure you that there are a lot of gamers out there who care very much about the budget, especially at the level of this laptop.
 
The point is with all the AI money JHH is printing adding 4GB more to the GPU would cost peanuts, literally & figuratively, for Nvidia. Yet here we are, Nvidia is more like today's Apple in this instance!
 
So far, DirectStorage has been a bust on PC, with lowered performance for negligible differences in loading, etc., with Nixxes basically saying there are better options on PC.

The PC potato race strikes again. Most gamers have really dated, poorly maintained hardware. The condition of the average gaming PC is really bad.

There are games, such as RE: Village and Forspoken, that, even at 1080p, experience issues with 8 GB cards. Village had crashing issues and severe stuttering; Forspoken textures would straight up not load properly. Frame graphs looked fine, but anyone LOOKING at the gameplay could tell something was wrong.

Visual degradation in such engines is an anomaly, not the norm. But these symptoms emerge primarily when settings are pushed beyond the hardware's reasonable capabilities, IMO.
 
The point is with all the AI money JHH is printing adding 4GB more to the GPU would cost peanuts, literally & figuratively, for Nvidia. Yet here we are, Nvidia is more like today's Apple in this instance!
It would not be peanuts, actually. You’d need to double it to preserve the same memory bus, or change to a wider one to get 12. I suppose there are funky asymmetrical configs that are possible, but there’s a reason they aren’t done often. So for a 128-bit bus it has to be either 8 or 16, and 16 gigs of brand-new GDDR7 would probably be considered a pointless waste on an entry SKU.
 
I suppose there are funky asymmetrical configs that are possible, but there is a reason they aren’t done often.
The 48 GB DDR5 on desktops is a recent & relevant example, so GDDR7 could also do it, but yes, I did have the 128-bit wide bus in mind & how that would also affect packaging. 16 GB on a mobile xx60 chip isn't happening, so we can all forget about that pipedream.
 
@R0H1T
VRAM has more issues than RAM with asymmetric configs. It’s stringent in that each GDDR package has a 32-bit connection. Getting, say, 12 gigs on a 128-bit bus, while theoretically possible, will cause non-uniform performance for part of that memory. CPU IMCs are more flexible and can run essentially whatever (to a certain extent), especially after a firmware update.
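The 32-bit-per-package arithmetic above can be sketched as a quick calculation. This is a toy example, not a spec reference: the 2 GB-per-chip density is an assumption about currently common GDDR packages, and the function name is made up for illustration.

```python
def vram_configs(bus_width_bits: int, chip_gb: int = 2) -> dict:
    """Possible uniform VRAM capacities for a given memory bus width.

    Assumes each GDDR package occupies a 32-bit channel, so the chip
    count is fixed by the bus width; capacity then scales with chip
    density, or doubles in a clamshell layout (two chips per channel).
    """
    chips = bus_width_bits // 32  # one 32-bit channel per package
    return {
        "single_sided": chips * chip_gb,      # one package per channel
        "clamshell": 2 * chips * chip_gb,     # two packages share a channel
    }

# A 128-bit bus with 2 GB packages lands on 8 GB or (clamshell) 16 GB,
# which is why 12 GB would need a wider bus or mixed densities.
print(vram_configs(128))  # {'single_sided': 8, 'clamshell': 16}
print(vram_configs(192))  # {'single_sided': 12, 'clamshell': 24}
```

This is why a 192-bit card like a desktop xx70-class SKU gets 12 GB "for free," while a 128-bit design is stuck choosing between 8 and 16.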
 