
NVIDIA Launches GeForce RTX 5060 Series, Beginning with RTX 5060 Ti This Week

LTT's just dropped a banger of a video calling out the shady manipulation of the press and ridiculous launch cycle, something HUB touched on yesterday.

Linus has basically joined the other Linus in screaming "F*** you, Nvidia" as loudly and blatantly as he can.
You know you've messed up when even Linus, the guy who admits he tries to be positive about products and find the good where others wouldn't (personally, I think that's because he can sometimes be more of an entertaining YouTuber than a serious reviewer), says he has very little to nothing to be positive about with the 5060 Ti 16GB, and he rightfully called out the weirdness of the 8GB card launches.

I like Linus and his company (well, not his anymore, but you know what I mean). They aren't idiots, and it seems that ever since GN made that one video about them ages ago, they've realized (whether they wanted to admit it to Steve or not) that they weren't going about their reviews the right way. This is a refreshing change.

Honestly, all their 50 series reviews have been bangers. No sugarcoating or excessive entertainment fluff, just serious reviews that are still a little silly. :)
 
Last edited:
LTT's just dropped a banger of a video calling out the shady manipulation of the press and ridiculous launch cycle, something HUB touched on yesterday.

Linus has basically joined the other Linus in screaming "F*** you, Nvidia" as loudly and blatantly as he can.

Why should I care what clickbait YouTubers say?
 
LTT's just dropped a banger of a video calling out the shady manipulation of the press and ridiculous launch cycle, something HUB touched on yesterday.

Linus has basically joined the other Linus in screaming "F*** you, Nvidia" as loudly and blatantly as he can.

That doesn't make them right. It just makes them look like amateurs who can't tell their backside from a hole in the ground.
Scratch that, misunderstood context there. Just watched the video.

he rightfully called out the weirdness of the 8GB card launches.
What's right about it?

The benchmarks overwhelmingly show that this card isn't doing much better than a 3070 8GB. So honestly, where is the advantage of 16GB vs 8GB? (Hint: there isn't one.)
My 3080 is STILL beating out this card two full generations later. The VRAM is not the issue.

I really wish people would quit parroting this "8GB is not enough" nonsense.
 
Last edited:
What's right about it?

The benchmarks overwhelmingly show that this card isn't doing much better than a 3070 8GB. So honestly, where is the advantage of 16GB vs 8GB? (Hint: there isn't one.)
That's not what I'm referring to. I think you're missing the point... nobody is actually talking about the VRAM differences for performance reasons (well, some are, but I think a lot of people are realizing that 8GB is still just fine). What they're actually talking about is how NVIDIA is only giving reviewers the 16GB models for review (at least, as far as reviewers have stated), and not letting them get their hands on 8GB ones unless they purchase them themselves. That is, if those statements are correct.

Sure, I don't think the performance difference is that noticeable (there's no reason to suggest there is one between the two models anyway; this isn't a 3050 6GB situation, and that wasn't even the VRAM's fault), but it's still shady. And weird. And that's what Linus is calling out (and he's not the only person calling this out, by the way; he's just one of the bigger voices).

You'd think NVIDIA would let partners give reviewers the 8GB models too, but they seemingly told them not to. Who knows why. But I (and many other people) am bothered by how weirdly shady it is. Luckily this isn't a 3050 6GB situation where the lower-VRAM card is severely cut down and sneakily stealth-launched. But it's just something that NVIDIA really has no reason to be doing.
 
That's not what I'm referring to. I think you're missing the point... nobody is actually talking about the VRAM differences for performance reasons (well, some are, but I think a lot of people are realizing that 8GB is still just fine).
Ah, yes I did miss your point. Sorry about that. I saw that one sentence and jumped in there.
What they're actually talking about is how NVIDIA is only giving reviewers the 16GB models for review (at least, as far as reviewers have stated), and not letting them get their hands on 8GB ones unless they purchase them themselves. That is, if those statements are correct.
Ah, that's a different thing. I haven't watched the LTT video. It seemed like they were parroting the same train of thought that JayzTwoCents has been on about.
Edit: Just watched it. I hate to say it, but Linus made a few good points there... Wow... Didn't think I'd be saying THAT...

Nvidia is really messing the bed on the whole of the 5000 series launch.
 
Last edited:
There is obviously a reason why they are being weird about the 8GB reviews. Maybe they don’t want people to see that 16GB isn’t needed in this space, so they don’t want holdouts. Maybe the 8GB model will be supply limited, but they can still say the 5060Ti “starts at less than $400.” It’s a weird thing to be weird about. Will reviewers even be able to get the 8GB card without the special access?
 
Ah, yes I did miss your point. Sorry about that. I saw that one sentence and jumped in there.
It's alright, man. :)
Ah, that's a different thing. I haven't watched the LTT video. It seemed like they were parroting the same train of thought that JayzTwoCents has been on about.
JayzTwoCents, GN, and various others, though it does contain a bit of their own opinion. I'm still pretty sure LTT is in the "8GB is fine but more would be nice" category regarding VRAM, so they didn't mention anything about VRAM performance differences for that reason, besides the fact that they don't have benchmarks yet.
 
JayzTwoCents, GN, and various others, though it does contain a bit of their own opinion. I'm still pretty sure LTT is in the "8GB is fine but more would be nice" category regarding VRAM, so they didn't mention anything about VRAM performance differences for that reason, besides the fact that they don't have benchmarks yet.
Here's the kicker. If you look at the new reviews, the 4060 Ti 8GB is neck and neck with the 4060 Ti 16GB.
If you look at the graphs, only one of them shows the 16GB version having anything more than a sliver of an advantage.
So if LTT is in the camp of folks who have objectively looked at factual data and concluded properly, then good on them.
 
Last edited:
Here's the kicker. If you look at the new reviews, the 4060 Ti 8GB is neck and neck with the 4060 Ti 16GB.
If you look at the graphs, not one of them shows the 16GB version having anything more than a sliver of an advantage.
Yeah, but what I mean is that "8GB is 'fine' but we'd like more just in case" is kind of where they seem to land. It seems that's where the more moderate opinion rests, and LTT appears to be in that camp. They didn't have a problem with the VRAM on 30 series cards, despite the fact that they would mention in the RDNA2 reviews that RDNA2's advantage over Ampere was the VRAM back then.
So if LTT is in the camp of folks who have objectively looked at factual data and concluded properly, then good on them.
I mean, they take their own data, so I'm sure they can see it themselves too.
 
Here's the kicker. If you look at the new reviews, the 4060 Ti 8GB is neck and neck with the 4060 Ti 16GB.
If you look at the graphs, not one of them shows the 16GB version having anything more than a sliver of an advantage.
So if LTT is in the camp of folks who have objectively looked at factual data and concluded properly, then good on them.

It's significant in some situations.

[Chart: The Last of Us Part I, 1920x1080]



To save you watching the video, here is what happens when it runs out of VRAM, especially on PCIe 3.0 systems, which a lot of people who buy cards in this class will still be using. It affects PCIe 4.0 as well, just not always as much. A bit more than just a sliver of difference, eh? The 8GB card is brought to its knees in situations where the 16GB card isn't breaking a sweat.
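
To put rough numbers on that PCIe point, here's a quick back-of-envelope sketch (my own approximations, not figures from the video; the x8 link width and the bandwidth values are assumptions) of how long it takes to shuffle a couple of gigabytes of overflow across each link versus keeping it in local VRAM:

```python
# Back-of-envelope sketch: why spilling past 8 GB of VRAM hurts more on
# PCIe 3.0 systems. Bandwidth figures are rough approximations, and the
# x8 link width is an assumption for cards in this class.

GiB = 1024**3

links = {
    "PCIe 3.0 x8 (older system)": 7.9e9,   # bytes/s, approximate
    "PCIe 4.0 x8":                15.8e9,  # approximate
    "Local VRAM (GDDR6/GDDR7)":   400e9,   # order-of-magnitude placeholder
}

spill = 2 * GiB  # suppose roughly 2 GiB of assets overflow into system RAM

for name, bandwidth in links.items():
    ms = spill / bandwidth * 1000
    print(f"{name:28s} -> ~{ms:6.1f} ms to move 2 GiB")
```

Order-of-magnitude only, but it shows why a card that keeps everything in local VRAM barely notices a workload that stalls the 8GB card, and why the stall is roughly twice as bad on a PCIe 3.0 board.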

[Six screenshots attached comparing the 8GB and 16GB cards]
 
Last edited by a moderator:
Why should I care what clickbait YouTubers say?
Because clickbait or not, LTT is still advocating for transparency and for GPU manufacturers to be honest.

Whatever their agenda, keeping Nvidia in check is an important role of journalists. If Nvidia had their way, we'd all be paying $1000 for a 96-bit RTX 5030 with 4GB VRAM and Nvidia's low-texture AI upsampling. We need independent reviewers to discourage rampant false advertising and misleading information. A lot of what Nvidia has published as claims for the 50-series is borderline illegal in many countries outside the US. False advertising and misrepresentation even make headlines in the US, because if it's blatant enough there will be lawsuits.

Market competition is good for consumers. An unfair playing field, where shady business practices and marketing lies undermine competition, is bad for you and me, without exception.
 
It's significant in some situations.
Nice cherry pick. That's why I showed the averages page. Outliers are the exception, not the rule.

Also, I'm not responding to the rest of your post as I don't feel the need to spend the next 3 hours of my time explaining why Steve Walton is full of crap and his testing methodology is so deeply flawed. IF everyone were getting the same numbers he's getting, he would have merit. But that's not happening, and this is not the first time the numbers he produces have been wildly out of whack. HUB is NOT a reliable source of information. Stop quoting them.

Because clickbait or not, LTT is still advocating for transparency and for GPU manufacturers to be honest.
In this situation, yeah, that's what he's doing.
Whatever their agenda, keeping Nvidia in check is an important role of journalists.
Also true! It's absolutely vital!
 
Nice cherry pick. That's why I showed the averages page. Outliers are the exception, not the rule.

Also, I'm not responding to the rest of your post as I don't feel the need to spend the next 3 hours of my time explaining why Steve Walton is full of crap and his testing methodology is so deeply flawed. IF everyone were getting the same numbers he's getting, he would have merit. But that's not happening, and this is not the first time the numbers he produces have been wildly out of whack. HUB is NOT a reliable source of information. Stop quoting them.

To be fair, you said not one graph shows the 16GB card being any better. That's misleading, isn't it?

Multiple different websites and YouTubers have produced similar results of 8GB cards collapsing in performance when VRAM is exceeded, where the 16GB card sails along unaffected. Yes, people can turn down some settings and mitigate the problem, but on a 16GB card you don't have to - that's the point. Pretending the two cards are all but identical in performance with nary a sliver between them is false.
 
To be fair, you said not one graph shows the 16GB card being any better. That's misleading, isn't it?
Corrected. Satisfied?
Multiple different websites and YouTubers have produced similar results of 8GB cards collapsing in performance when VRAM is exceeded, where the 16GB card sails along unaffected. Yes, people can turn down some settings and mitigate the problem, but on a 16GB card you don't have to - that's the point. Pretending the two cards are all but identical in performance with nary a sliver between them is false.
I've done testing myself and none of the games I tested showed any differences. So no, NOT false. But you feel free to believe whatever you like.
 
It's significant in some situations.

[Chart: The Last of Us Part I, 1920x1080]



To save you watching the video, here is what happens when it runs out of VRAM, especially on PCIe 3.0 systems, which a lot of people who buy cards in this class will still be using. It affects PCIe 4.0 as well, just not always as much. A bit more than just a sliver of difference, eh? The 8GB card is brought to its knees in situations where the 16GB card isn't breaking a sweat.

[Six quoted screenshots (attachments 395356 to 395361)]
Some strange results considering the VRAM usage isn't tapped out. If you look at the system RAM usage, the 8GB cards are using a lot more memory in most of those examples.
 
Some strange results considering the VRAM usage isn't tapped out. If you look at the system RAM usage, the 8GB cards are using a lot more memory in most of those examples.

Presumably the implication is that the 8GB card is swapping to system memory, which brings it down while the 16GB card doesn't have to? I believe this is why porting console games can be challenging (one of the reasons), because consoles have access to a huge pool of shared fast memory which at least on the PS5 is all the same speed.
 
which brings it down while the 16GB card doesn't have to?
Except that's not what is happening in the real world. HUB's results don't match everyone else's. That ONE game you pointed out (not a game I play, which is why I missed it) is the only exception to the rule of the 8GB vs 16GB cards being a sliver away from each other in performance.
because consoles have access to a huge pool of shared fast memory which at least on the PS5 is all the same speed.
Also incorrect. The PS5 has 16GB of total RAM; that memory pool needs to be shared with the rest of the system resources (CPU, I/O, game base code, etc.), and that can never end well for VRAM usage.
 
Also incorrect. The PS5 has 16GB of total RAM; that memory pool needs to be shared with the rest of the system resources, and that can never end well for VRAM usage.

That is what I meant. There is no slow and fast memory for the developer to worry about. It's all just fast.
 
Ok, how then does that tie into this discussion?

No need to be hostile, man; it seems obvious enough. The 8GB card chokes sometimes when cards with more memory don't.

Indiana Jones is another one.

 
No need to be hostile, man; it seems obvious enough.
I'm not being hostile; if I were, you would have zero doubt of it. You're sidestepping the issue. Please explain how the PS5 RAM scheme has anything to do with this discussion; I'm genuinely curious.
The 8GB card chokes sometimes when cards with more memory don't.
Not from what I've seen. TPU benchmarks don't show that, and neither does anyone else with credibility.
For example, someone whose testing is actually trustworthy:
This gentleman shows the numbers and the settings used in real time.
There is very little difference between 8GB and 16GB at 1080p or 1440p.

I'll take TPU and JegsTV's numbers over HUB all day, any day, every day.
 
Last edited:
I would guess what OP means is that with unified/shared memory, there is no loss (speed or latency) when swapping between system RAM and VRAM on a PS5 because it's all one bucket on the same bus. When a GPU on a PC runs out of VRAM, it takes a hit going to system RAM, or it chokes completely until settings are adjusted to keep it from capping out. Yes, consoles only have 16GB to work with, but that's a different issue. So long as the console can efficiently and dynamically use the RAM it has, that's the difference as far as this discussion is concerned.

EDIT: It also helps that modern consoles have NVMe storage, which should theoretically help offset a lack of system RAM if there's a need to hit a swap file. It's not ideal, but 5,500 MB/s is way better than the previous gen's SATA spinners.
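
For a rough sense of that hierarchy, here's a small sketch (the bandwidths are ballpark figures I picked for illustration, not measurements, and the 4 GB asset budget is just an example) comparing how long it takes to stream the same data from each tier:

```python
# Rough comparison of the memory/storage tiers being discussed.
# All bandwidth numbers are ballpark assumptions, not measurements.

GB = 1e9

tiers = {
    "SATA spinner (last-gen console)": 0.15 * GB,
    "Console NVMe (~5,500 MB/s)":      5.5 * GB,
    "System RAM over PCIe 4.0 x8":     15.8 * GB,
    "Unified GDDR6 pool (PS5-class)":  448 * GB,
}

asset_budget = 4 * GB  # hypothetical 4 GB of streamed assets

for name, bandwidth in tiers.items():
    seconds = asset_budget / bandwidth
    print(f"{name:33s} -> ~{seconds:6.2f} s to stream 4 GB")
```

The gap between the unified pool and everything below it is the point: on console the GPU never falls off onto a slow tier, while a PC card that runs out of VRAM drops down at least one rung.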
 
I would guess what OP means is that with unified/shared memory, there is no loss (speed or latency) when swapping between system RAM and VRAM on a PS5 because it's all one bucket on the same bus. When a GPU on a PC runs out of VRAM, it takes a hit going to system RAM, or it chokes completely until settings are adjusted to keep it from capping out. Yes, consoles only have 16GB to work with, but that's a different issue. So long as the console can efficiently and dynamically use the RAM it has, that's the difference as far as this discussion is concerned.
I can see that. The problem with that is there are only a few (2) examples where the differences are meaningful. They are otherwise not relevant to 1080p or 1440p gaming. At 2160p or ultrawide resolutions, okay, yeah, I'll concede that 8GB would show lesser performance. But most people are not gaming at 4K or ultrawide. No one doing 4K or ultrawide gaming is going to buy a 5060/Ti card, so that point is irrelevant as well.
 
I can see that. The problem with that is there are only a few (2) examples where the differences are meaningful. They are otherwise not relevant to 1080p or 1440p gaming. At 2160p or ultrawide resolutions, okay, yeah, I'll concede that 8GB would show lesser performance. But most people are not gaming at 4K or ultrawide.
Yeah, in the above examples, it's running 1440p ultimate, which I think is going to push past the limits of 8GB quite often in a modern title. They market these as 1080p ultimate cards, which they should easily manage. The shame of it is, if the GPU is capable of running 1440p but is being held back by a lack of RAM, then it kinda feels like the 8GB variant is artificially limited. That leads me back to my original comment, where I think the 8GB version is just there to hit the sub-$400 price point so NVIDIA can say they did something, but the 16GB card is there for the upsell.
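
Purely as a hypothetical back-of-envelope for the resolution point (the bytes-per-pixel figure below is my assumption, and in practice textures, not render targets, eat most of the 8GB, so treat it as a floor rather than a measurement):

```python
# Hypothetical illustration: per-frame render-target memory scales with
# pixel count, which is one reason "1440p ultimate" leans on VRAM harder
# than 1080p. Real usage is dominated by textures, so this is only a floor.

BYTES_PER_PIXEL = 64  # assumed total across G-buffer and post-process targets

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
base_pixels = 1920 * 1080

for name, (w, h) in resolutions.items():
    megabytes = w * h * BYTES_PER_PIXEL / 1e6
    print(f"{name}: ~{megabytes:.0f} MB of render targets "
          f"({w * h / base_pixels:.2f}x the pixels of 1080p)")
```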

Considering how things have been going so far, I suspect the 8GB cards will be very hard to find, especially near MSRP. AIBs will focus on 16GB RGB OC models with oversized coolers and sell them for 5070 prices. My cursory look at MicroCenter's SKUs supports that notion. They have 16 5060 Ti SKUs, and only TWO of those are 8GB models. The highest-priced 5060 Ti is exactly one dollar less than the cheapest 5070. And it doesn't matter, as everything is sold out already. The 5060 Tis were gone in less than half an hour.
 
keeping Nvidia in check is an important role of journalists
I’m not sure Nvidia needs to be “kept in check”, nor is it a journalist‘s job to do so.

Finally, Linus is far from being a journalist.
 