
Price Cuts Bring the GeForce RTX 4060 Ti to Within $15 of Radeon RX 7600 XT

These cards are for 1080p, and for that, 8GB is just enough today.
You've said it exactly.

"Just enough for today" and "only 1080p". These aren't cheap GPUs that are disposable to be replaced in a year, they're expected to last for at least 2-3 years at a minimum.

1440p is rapidly becoming the mainstream resolution, especially since upscaling to 1080p from anything lower looks like garbage, whilst upscaling from 1080p to 1440p is pretty decent. That matters because monitors are cheap and GPUs aren't, and many new games use upscaling and need it for acceptable performance at max settings.

Today's games are developed with 8GB as the recommended spec, since their development started when XB1 and PS4 support were still required. Most of the UE5 stuff is now squeezed down to fit into 8GB by the devs, and several of them have openly criticised the wasted effort it takes to compromise and tweak their assets to run on 8GB hardware. Going forward, games in development right now, due later this year or next, will likely be targeting 12GB or 16GB for max settings now that the XB1 and PS4 are officially deprecated by the console vendors.
 
I agree, AMD made some bad moves with the 6500 and 6400 cards. Plus there is a lot of nonsense about 16GB being "too much VRAM". I was using 10GB+ with three browsers and zero games open ~18 months ago. It's not just about what video games use.
Good point. It's been a long time since I had to deal with VRAM struggles (RTX 2060 6GB), but if you're cutting it close to the bone with your game settings, it's definitely worth disabling GPU acceleration in your browsers and in any background apps like Spotify and Discord that can use VRAM too.
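If you want to see which background apps are actually holding onto VRAM before you start trimming game settings, a quick script can list it per process. This is just a rough sketch, assuming an NVIDIA card and the nvidia-ml-py package (which provides the pynvml module); AMD users would need a different tool:

```python
# Rough sketch: list processes currently holding VRAM on the first NVIDIA GPU,
# so you can spot browsers, Discord, Spotify, etc. before launching a game.
# Assumes the nvidia-ml-py package (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Graphics processes cover browsers, overlays, games and other desktop apps.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    mem_mib = (proc.usedGpuMemory or 0) / (1024 ** 2)  # may be None on Windows/WDDM
    try:
        name = pynvml.nvmlSystemGetProcessName(proc.pid)
    except pynvml.NVMLError:
        name = "unknown"
    print(f"PID {proc.pid} ({name}): {mem_mib:.0f} MiB of VRAM")

pynvml.nvmlShutdown()
```

On Windows the per-process figure can come back empty under WDDM, in which case Task Manager's dedicated GPU memory column is the easier fallback.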

Reviewers like HUB and DF who look at VRAM capacity issues test on sterile OS installs to minimise interference from background apps and processes, so that they show only the performance of the game and hardware being benchmarked.

In the real world, nobody uses an offline, sterilised OS for one game at a time with nothing else running.
 
I agree, AMD made some bad moves with the 6500 and 6400 cards. Plus there is a lot of nonsense about 16GB being "too much VRAM". I was using 10GB+ with three browsers and zero games open ~18 months ago. It's not just about what video games use.


Sniper Elite 5 maxed out VRAM without crashing; Diablo IV maxed it out every 30-90 seconds and would crash if you clicked the error message window, back in summer 2023. Those are just the ones off the top of my head. There are a lot more games out there than the ones talked about in video card review videos. And yes, 12GB cards are entry level, because 16GB cards are lower-midrange at best.


Diablo 4 crashes even on a 7900 XTX.

It has to do with the poor optimization and memory leak that game has.
 
It's about time the 60-class got in on the action... 70-class next, perhaps? The 4070 Ti Super was never a solution, just a means to keep the perception of high value alive.

Someone recently posted (on TPU) one major EU retailer's Q4 2023 sales numbers, where AMD's 7000-series unit sales were outpacing Nvidia's 40-series. A damn good indicator for more and faster price cuts from the Nvidia camp, which are well overdue.

I hope both Nvidia and AMD take this one to the ring without gloves - we want BLOOD! Bloody better value propositions from both camps. Although for some time now I've been more of the opinion that this two-player market is a disciplined one with mutually contrived collusion at play... you know, soft gloves with the illusion of throwing wild punches to keep those profits JUICY!

Both the RTX 4060 Ti and the RTX 4060 appear to be designed to withstand a great degree of price cuts, to compete against the RX 7600 XT and RX 7600.

Hehe

Designed to withstand? And all this time I thought they were ripping us off; didn't realise it was a "feature" lol
 
The 4060 Ti uses a 188 mm² die. Personally I would wait for the 4070 at $399 for the 12GB, because too many games are allocating above 10 GB already, even at low settings. No idea about the difference between allocating and actually needing; yeah, I don't want stuttering.
Allocated VRAM is not the same as used VRAM.
Used VRAM is what the game really needs.
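For anyone who wants to eyeball the allocated-vs-needed gap themselves, here's a minimal sketch (assuming an NVIDIA card and the nvidia-ml-py package) that polls the driver's reported VRAM figure while a game runs. Keep in mind the number it prints is allocation/residency, which includes cached assets the game could evict, so it tends to read higher than what the game strictly needs each frame:

```python
# Rough sketch: poll the driver-reported VRAM usage while a game is running.
# The "used" figure is everything resident on the GPU (allocations, caches),
# not the minimum the game needs, so treat it as an upper bound.
# Assumes an NVIDIA GPU and the nvidia-ml-py package.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(60):  # ~5 minutes at 5-second intervals
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gib = info.used / (1024 ** 3)
        total_gib = info.total / (1024 ** 3)
        print(f"{used_gib:.1f} / {total_gib:.1f} GiB resident")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```

Whether a game actually *needs* that much still only shows up as stuttering or texture downgrades on a smaller card, which is what the benchmark reviews end up measuring.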
 
With these price cuts the 4070 prices need to come down as well. $425 to $450 would be nice.
Would be nice, yes - but there's no incentive for Nvidia to do so. The midrange hasn't been shaken up yet, and the only 4060 Ti getting discounts is the 8GB runt model that's ageing like milk left out in the sun. The 16GB model, which is at least fast enough to drive 1440p and use more than 8GB, is holding its price of $450 or so without problems.

Sadly, the 7800 XT is selling well at $510 or so, and as a direct competitor it's anchoring the 4070 to that price point.
 
12GB cards are entry level, and 8GB cards are just wasted silicon; I keep maxing out the 16GB on my card. Also, I'm getting tired of people suggesting that low-end cards are going to pull off ray tracing when they don't produce enough FPS to begin with to justify turning it on.
It's allocating VRAM, not using it.
Games run just fine even on 8GB VRAM GPUs.

Flight Simulator.


Yes, because 8GB is already not enough in many games and settings.
12 GB is the bare minimum if you are seriously into gaming.
Games allocate VRAM; the more you have, the more they will allocate.
But 8GB cards will work just fine because real VRAM usage is lower.

Hardware Unboxed and Digital Foundry have both covered 8GB VRAM limitations across multiple articles from early 2023 onwards, when the first crop of current-gen exclusive* console ports arrived on the scene and struggled mightily on PCs with less than 12GB of VRAM. Those games have been cleaned up and patched somewhat since their terrible launches, but Hogwarts Legacy, Jedi: Survivor, A Plague Tale: Requiem, The Last of Us, and Wild Hearts, to name just a few, list 8GB cards as a minimum and typically can't run max details unless you have 10+ GB.

* i.e., not dumbed down to also run on PS4 and XB1
Those all run max details on 8GB VRAM GPUs.
 
Those all run max details on 8GB VRAM GPUs.
They do now, yes. I've covered this already: at launch, these games were largely unplayable at max settings on 8GB cards. The devs went back and wasted time and effort shuffling assets and data streaming around to patch these games and make 8GB cards viable over the course of a few months. That was dev time spent accommodating a shortage of VRAM that could have been spent on more content, DLC, or actual gameplay fixes.

This isn't my opinion, btw - these are official statements from developers like Naughty Dog, Respawn Entertainment, and Avalanche Software, covered in interviews, podcasts, and developer blogs, picked up by countless content creators across YouTube, Twitter, Twitch, etc., and spread around the web in just about every format that covers those games and their progress.
 
Those all run max details on 8GB VRAM GPUs.
Not really, not at great frame rates anyway. Depending on the game, turn some settings down at 1080p and yeah, they'll play well, but not at max settings. And 1440p? Forget about it; you're turning settings down there too. The benchmarks right here on TPU clearly show this.
 
They do now, yes. I've covered this already: at launch, these games were largely unplayable at max settings on 8GB cards. The devs went back and wasted time and effort shuffling assets and data streaming around to patch these games and make 8GB cards viable over the course of a few months. That was dev time spent accommodating a shortage of VRAM that could have been spent on more content, DLC, or actual gameplay fixes.

This isn't my opinion, btw - these are official statements from developers like Naughty Dog, Respawn Entertainment, and Avalanche Software, covered in interviews, podcasts, and developer blogs, picked up by countless content creators across YouTube, Twitter, Twitch, etc., and spread around the web in just about every format that covers those games and their progress.
Just shows that the people making the port were lazy, looking for a quick flip. The memory was clearly enough, but they didn't want to spend the time to optimize until the PR about how badly the game ran made them do it. They hoped to put as little effort into it as possible. A 4060 8GB is all it needs, as once you get close to that 8GB of usage you're running out of GPU anyway.
 
My Gigabyte 3060 Eagle OC 2.0 12GB is really shining for me right now. Clean Win 11 23H2 install, always the latest NVIDIA drivers, installed using NVCleanstall, and I've never had a problem once: no crashes, no freezes, no stuttering. I also enable Above 4G decoding with Resizable BAR for full memory range access.

I also recommend forcing two things in the NVIDIA Control Panel: 16x AF and high quality texture filtering. That greatly improves the actual AF in games even when you select 16x AF in-game, as they are two different layers. Try looking straight down a runway and compare the fine lines at the vanishing point; you'll see. Also, if you have a G-Sync monitor, you want to let it do all the work dynamically like it's supposed to, but to do that you have to set V-Sync OFF in all your games so the monitor has access to all the frames, letting it decide the best ones to lay down. I have many other tips for gaming performance; if you want more, just ask! Here are my specs:

AMD Ryzen 5 5600X 6-Core Processor @ 3.70 GHz
32 GB Mushkin Enhanced 3600 XMP CL14
Gigabyte X570 Aorus Pro WiFi rev 1.1
Gigabyte 3060 Eagle OC 2.0 12GB LHR - don't care about mining, not even one... bit :p
WD Black SN850X 2TB NVMe (~7.3 GB/s)
Corsair 850x PSU
240Hz LG UltraGear, G-Sync, VESA HDR 400 certified
P-series silent case
No overclocking - no need if you have a stable, efficient setup and the correct settings.
Also turn off PBO; in my opinion it does very little besides reduce the life of your processor, and there are already boost mechanisms in place.
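Side note on the Above 4G decoding / Resizable BAR tip: if you want to double-check that it actually took effect after flipping the BIOS switches, a rough sketch like this (again assuming an NVIDIA card and the nvidia-ml-py package; the BAR1 query may not be exposed on every driver) compares the BAR1 aperture to total VRAM:

```python
# Rough sketch: sanity-check Resizable BAR by comparing the BAR1 aperture
# to total VRAM. With ReBAR active the aperture is roughly the size of the
# whole VRAM pool; without it, it's usually a small 256 MiB window.
# Assumes an NVIDIA GPU and the nvidia-ml-py package.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

bar1 = pynvml.nvmlDeviceGetBAR1MemoryInfo(handle)
vram = pynvml.nvmlDeviceGetMemoryInfo(handle)

bar1_gib = bar1.bar1Total / (1024 ** 3)
vram_gib = vram.total / (1024 ** 3)
print(f"BAR1 aperture: {bar1_gib:.1f} GiB, VRAM: {vram_gib:.1f} GiB")

# Rough heuristic: a BAR1 window at least as large as VRAM means ReBAR is on.
print("Resizable BAR looks", "enabled" if bar1.bar1Total >= vram.total else "disabled")

pynvml.nvmlShutdown()
```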
 
Diablo 4 crashes even on a 7900 XTX.

It has to do with the poor optimization and memory leak that game has.
D4 was a mess, and VRAM hogging wouldn't surprise me. The game was fundamentally broken in every way, and a 1% boost to a miscellaneous skill is just one example of how underwhelming and disappointing it was.
 
12GB being entry level... and here I am with an RX 560D 4GB. Well... and my Steam Deck OLED. Though I'll admit I don't have much time (or money) for these newer, more demanding games.

Perhaps someday a modder will figure out a way to get something like SLI or other multi-GPU tech to pool memory together, with 8 + 8 being seen as a single 16GB card. That would certainly help move sales of lower-VRAM cards. I know my old Voodoo2s worked that way - in fact, SLI unlocked higher resolutions!
 