
4060TI Questions?

I said decade-old memory configs... not cards.
8GB GPUs aren't a "decade old" config, especially not in the mid-range.

My point is that the industry is moving on, and 12GB is becoming the new standard. If a game can run on an 8GB card at 1080p or 1440p, then that's great. I'm just saying, those days are coming to an end. We knew this was going to happen 2 years ago, and the time is upon us.

Idk how many times we can blame it on 'optimization' when the reality is, these games are primarily made for consoles with 16GB of GDDR6 unified memory, with I believe 10.5GB available on the Xbox for graphics and technically no limitation on the PS5... but it does need to store other kinds of data and whatnot, so it's around 12GB available for graphics.
Xbox Series S has 10GB total, so there goes your argument.

It's been done in the past, but developers have straight up said their new targets are 12GB for high settings.
[citation needed]
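
For anyone wanting to sanity-check the memory figures being quoted above, here's a rough back-of-the-envelope sketch. The helper name and the OS-reserve and CPU-side numbers are loose assumptions chosen to line up with the estimates in this thread, not published specs:

# Rough sketch of how a console's unified memory pool splits up vs. a PC GPU's VRAM.
# The reserve figures below are illustrative assumptions, not official numbers.

def graphics_budget_gb(total_gb, os_reserve_gb, cpu_side_data_gb):
    """Memory left for textures, geometry and render targets."""
    return total_gb - os_reserve_gb - cpu_side_data_gb

# Consoles: one 16GB GDDR6 pool shared between OS, game logic and graphics.
xbox_series_x = graphics_budget_gb(16, os_reserve_gb=2.5, cpu_side_data_gb=3.0)  # ~10.5GB
ps5           = graphics_budget_gb(16, os_reserve_gb=3.0, cpu_side_data_gb=1.0)  # ~12GB

# PC: game logic and the OS live in system RAM, but everything graphics-related
# still has to fit into the card's own VRAM.
pc_8gb_card = 8.0

print(f"Series X graphics budget: ~{xbox_series_x:.1f} GB")
print(f"PS5 graphics budget:      ~{ps5:.1f} GB")
print(f"8GB PC card:               {pc_8gb_card:.1f} GB")

Whether 8GB on PC is actually comparable then comes down to how much of that console budget a given port really leans on, which is exactly what the thread is arguing about.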
 
Pretty interesting. Not a total dud after all, then... at 300-350 bucks, that is. 8GB is definitely a limiter, at least situationally. Also interesting to see that the 16GB card is allowed to pull a good 15W more in those situations.

8GB GPUs aren't a "decade old" config, especially not in the mid-range.


Xbox Series S has 10GB total, so there goes your argument.


[citation needed]
Not a decade, but we've had an 8GB x70 since 2016, and consoles with a similar amount just the same. And frankly, 7 years really is the expiration date for any GPU unless it's top end and well tended.

The Series S is by now well known as a problematic machine in recent games, especially because of its gap in specs with the X. MS recently 'unlocked' some extra MBs for devs precisely to give them more headroom for optimisation, and it still isn't a true fix.

Let's not be blind here... everything points at the same issue, even if it's minor today.
 
Pretty interesting. Not a total dud after all, then... at 300-350 bucks, that is. 8GB is definitely a limiter, at least situationally. Also interesting to see that the 16GB card is allowed to pull a good 15W more in those situations.


Not a decade, but we've had an 8GB x70 since 2016, and consoles with a similar amount just the same. And frankly, 7 years really is the expiration date for any GPU unless it's top end and well tended.

The Series S is by now well known as a problematic machine in recent games, especially because of its gap in specs with the X. MS recently 'unlocked' some extra MBs for devs precisely to give them more headroom for optimisation, and it still isn't a true fix.

Let's not be blind here... everything points at the same issue, even if it's minor today.
I think the peak power draw is only 5 W more for 4 additional memory devices, which is pretty good for GDDR6. That 19 W higher power draw in Spider-Man Remastered is due to the GPU working harder now that it has sufficient RAM. An interesting test would have been to measure power consumption in particularly VRAM-limited games like The Last of Us Part I (99th percentile FPS increased by 32%).
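
Splitting that out (the 5 W and 19 W deltas are the review figures discussed above; the per-package number is just that divided by the device count, so treat it as a ballpark):

# The 16GB 4060 Ti runs 8 GDDR6 packages in clamshell mode vs. 4 on the 8GB card.
extra_packages = 4
memory_delta_w = 5.0                 # peak power attributed to the extra memory
spiderman_delta_w = 19.0             # total delta seen in Spider-Man Remastered

per_package_w = memory_delta_w / extra_packages
gpu_side_w = spiderman_delta_w - memory_delta_w

print(f"~{per_package_w:.2f} W per extra GDDR6 package")    # ~1.25 W
print(f"~{gpu_side_w:.0f} W left over, i.e. the core working harder once it isn't VRAM-starved")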
 
Not a decade, but we've had an 8GB x70 since 2016, and consoles with a similar amount just the same. And frankly, 7 years really is the expiration date for any GPU unless it's top end and well tended.

The Series S is by now well known as a problematic machine in recent games, especially because of its gap in specs with the X. MS recently 'unlocked' some extra MBs for devs precisely to give them more headroom for optimisation, and it still isn't a true fix.

Let's not be blind here... everything points at the same issue, even if it's minor today.
Sure, but I'm tired of so-called tech-savvy people blaming GPU manufacturers for everything while giving game devs a free pass. It used to be that the two would meet in the middle; now it seems the latter expect their games to transfer over to PC with zero effort on their part, and that's not an attitude we should be rewarding.

Honestly, if either NVIDIA or AMD were smart, they'd do a respin of their midrange GPUs to bump the bus width to 192-bit and enable 12GB models. But they're not smart, just greedy, so here we sit.
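
For context on why the bus width is the lever here: with GDDR6, each memory package hangs off a 32-bit channel and currently tops out at 2GB (16Gb), so the bus width fixes the capacity steps. A quick sketch (the 2GB-per-package figure is today's common density, not a hard limit, and the helper is just illustrative):

# Capacity options that fall out of a given GDDR6 bus width.
def capacity_options_gb(bus_width_bits, package_gb=2):
    channels = bus_width_bits // 32      # one package per 32-bit channel
    normal = channels * package_gb
    clamshell = normal * 2               # two packages sharing each channel
    return normal, clamshell

for bus in (128, 192, 256):
    normal, clamshell = capacity_options_gb(bus)
    print(f"{bus}-bit -> {normal} GB, or {clamshell} GB clamshell")

# 128-bit -> 8 GB / 16 GB  (the two 4060 Ti configurations)
# 192-bit -> 12 GB / 24 GB (what a 192-bit respin would enable)
# 256-bit -> 16 GB / 32 GB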
 
Sure, but I'm tired of so-called tech-savvy people blaming GPU manufacturers for everything while giving game devs a free pass. It used to be that the two would meet in the middle; now it seems the latter expect their games to transfer over to PC with zero effort on their part, and that's not an attitude we should be rewarding.

Honestly, if either NVIDIA or AMD were smart, they'd do a respin of their midrange GPUs to bump the bus width to 192-bit and enable 12GB models. But they're not smart, just greedy, so here we sit.
I agree and I think both are true, to a degree.
 
8GB GPUs aren't a "decade old" config, especially not in the mid-range.

Slight exaggeration.

Xbox Series S has 10GB total, so there goes your argument.

I had the Series X in mind. But sure, if you only want 1080p at low-to-medium settings going forward, you could get by on 8GB; I even said that in my post.

Or if you don't mind low-quality textures in new games, that's fine too. I still use a computer with 32MB of VRAM for old games and emulators. But if we're talking new AAA games at high settings (even 1080p in some cases), 8GB ain't cutting it anymore. Not without a lot of extra work that developers aren't going to keep doing forever.
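
As a rough illustration of that settings-versus-resolution point: render targets scale with pixel count, while the resident texture pool scales with the quality preset. Everything below (the helper names, buffer counts, texture counts and sizes) is an illustrative assumption, not a measurement from any particular game:

# Toy model: why texture quality can blow past 8GB even at 1080p.
def render_targets_gb(width, height, bytes_per_pixel=8, num_targets=10):
    # e.g. a G-buffer plus depth, velocity and post-processing buffers
    return width * height * bytes_per_pixel * num_targets / 1024**3

def texture_pool_gb(resident_textures, resolution, bytes_per_texel=1.0):
    # BC7-style block compression is ~1 byte per texel; +33% for the mip chain
    return resident_textures * resolution**2 * bytes_per_texel * 1.33 / 1024**3

for w, h, label in [(1920, 1080, "1080p"), (2560, 1440, "1440p"), (3840, 2160, "4K")]:
    print(f"{label}: ~{render_targets_gb(w, h):.2f} GB of render targets")

print(f"High preset (300 resident 4K textures):   ~{texture_pool_gb(300, 4096):.1f} GB")
print(f"Medium preset (300 resident 2K textures): ~{texture_pool_gb(300, 2048):.1f} GB")

In this toy model, going from 1080p to 4K adds roughly half a gigabyte of render targets, while going from 2K to 4K source textures adds several gigabytes, which is why the texture slider rather than the resolution is usually what pushes an 8GB card over the edge.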

[citation needed]

I've seen it said a few times, but here's one example I can remember:


Start at around 54:27 if you just want to specifically hear that line.

Then there are the games that list 12GB or 16GB GPUs in their recommended specs; this one comes to mind, for instance.


I agree and I think both are true, to a degree.
I agree, both are true. Hoping that once DirectStorage makes its way into more games, the process will become smoother.
 
I agree, both are true. Hoping that once DirectStorage makes its way into more games, the process will become smoother.
Hopes and dreams, or you can just buy a 6800 and be safe. If it's not working and widely used today, you can wait 3+ years for it to no longer be a problem. That's a moment where you're looking at new GPUs already.
 
Hopes and dreams, or you can just buy a 6800 and be safe. If it's not working and widely used today, you can wait 3+ years for it to no longer be a problem. That's a moment where you're looking at new GPUs already.
Hey we can all use a little bit of hope!

I have a 4090, so I'm not immediately worried for myself. A bit worried about the PC gaming industry as a whole, maybe. I know it's doing fine right now, but... hoping too many people don't ditch PC gaming because of high prices (though I suppose there's always the Steam Deck), and hoping developers don't stop porting games because there are too many people with insufficient specs expecting to play at max settings and complaining (though I suppose as long as they pay, it doesn't matter so much if they complain, in the eyes of the corporation).

But I really do think DirectStorage will make things easier. It's been confirmed for Ratchet & Clank, so... that's something!
 