
NVIDIA GeForce RTX 4060 Ti to Feature Shorter PCB, 220 Watt TDP, and 16-Pin 12VHPWR Power Connector

The specs seem worse than they actually are (because of L2 cache and very high clocks)…
So, let's wait for actual benchmarks of finalized products then. :)

And VRAM capacity is an actual problem. 8 GB was already problematic on the 3070.
Why?
RTX 4060/4060 Ti is mostly going to be a 1080p card, or 1440p at medium details.

We've had this discussion every generation: people make up subjective anecdotes about how much VRAM a card actually needs, when benchmarks show these cards still scale pretty well even at 4K. I wouldn't worry about 8 GB on such a card, both because I know how graphics actually works, but most importantly because benchmarks show they run out of memory bandwidth and computational power long before VRAM under normal (realistic) use cases. Sure, you can load ridiculous custom texture packs that eat VRAM, but that's an unbalanced, unrealistic edge case and shouldn't be the basis for buying recommendations.

And please don't bring up the future-proofing argument, that argument is BS and it hasn't held true in the past.

660 -> 960 - 37% performance increase with a 15% price decrease (I am ignoring the 760 because it was a refresh)
960 -> 1060 - 100% performance increase with a 25% price increase (there was also a slightly slower model with 3 GB at just $200, same as 960)
1060 -> 2060 - 56% performance increase with a 40% price increase
2060 -> 3060 - 21% performance increase with a 6% price decrease
3060 -> 4060 - ~45% performance increase with a 21% price increase (if leaks are true)
If you take inflation over the product's lifespan into account, it doesn't look that bad at all. Most years it's ~1.5-2.5%, but as we know, the past couple of years combined add up to ~15% in the official numbers (~25%+ if we look at real inflation since the old models launched). So using the same assumptions as you, 3060 -> 4060 gives ~45% extra performance at roughly the same real price. Why complain?
(Not trying to spawn a political discussion here, just trying to look at things from a real world perspective)
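A quick sanity check of that arithmetic, using the figures above (~21% nominal price increase, ~45% performance uplift, ~15% official inflation — all leaked/rough numbers, not confirmed):

```python
# Adjust the leaked 3060 -> 4060 price increase for inflation over the
# same period. The 21% / 45% / 15% figures come from the discussion
# above and are rough, unconfirmed numbers.

nominal_price_increase = 0.21   # leaked 4060 vs 3060 MSRP delta
perf_increase = 0.45            # leaked generational uplift
inflation = 0.15                # rough official CPI over the lifespan

real_price_increase = (1 + nominal_price_increase) / (1 + inflation) - 1
perf_per_real_dollar = (1 + perf_increase) / (1 + real_price_increase)

print(f"Real price increase: {real_price_increase:+.1%}")
print(f"Performance per inflation-adjusted dollar: {perf_per_real_dollar:.2f}x")
```

So in real terms the price is up only ~5%, which is where the "roughly the same real price" claim comes from.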

It seems pretty clear why the 1060 was the number 1 card on Steam for almost 6 years.
GTX 1060 was undoubtedly one of the greatest card deals of all time (possibly the best seller too?), and its dominance had several contributing factors:
- The card was good (obviously), and despite the critics, the cheap 3 GB version was plenty for most buyers in this segment.
- Supplies were great most of the time, and it could often be found below MSRP, especially the 3 GB version.
- The competition, RX 480/580, was in really short supply.
- The successor, Turing, was late (~2.5 years), and supplies weren't great initially.

I've built three systems with this card myself, and recommended it many times. The Pascal generation has held up pretty great overall too.

And it is beyond my comprehension why they haven't tried to repeat that success with any card since. Corporations' obsession with margins instead of overall profits is incredible (I experienced that myself working for one).
The solution is more competition. AMD (and Intel) needs to produce good cards in the mid-range and have good availability in all markets to really push prices down. Otherwise the current trend will continue.
 
I wonder what adaptor will be included; no point in doing the same 4-way tentacle-like one as with the 4090/4080.
 

I actually do kind of like the fact that they are forcing this new connector. The adapter is problematic right now, but we will get to a point where all new PSUs support this connector and we will be able to power any card using just one cable. I just wish they had designed it a little better, because it is supposedly hard to seat the connector perfectly even with native cables.

I never liked any of the ATX connectors. The 20/24-pin is annoying, and the 6+2-pin is horrible (connecting three of these to my 3080 was a nightmare; I even tore off a bit of skin on the top of my finger). I never get that click sound, and I have to use quite a bit of force to really push them in. They should absolutely just slide in and make a click so that you know they are properly seated.
The SATA power connector is probably the only one I never had issues with. Even Molex was annoying, and the pins would sometimes slide out of it.
 
The solution is more competition. AMD (and Intel) needs to produce good cards in the mid-range and have good availability in all markets to really push prices down. Otherwise the current trend will continue.
Currently, AMD has better options in the mid-range. Both the 6600 and 6700 series are better deals than the competition right now. We'll see if it holds up with the 7600 and 7700.

I actually do kind of like the fact that they are forcing this new connector. The adapter is problematic right now, but we will get to a point where all new PSUs support this connector and we will be able to power any card using just one cable. I just wish they had designed it a little better, because it is supposedly hard to seat the connector perfectly even with native cables.

I never liked any of the ATX connectors. The 20/24-pin is annoying, and the 6+2-pin is horrible (connecting three of these to my 3080 was a nightmare; I even tore off a bit of skin on the top of my finger). I never get that click sound, and I have to use quite a bit of force to really push them in. They should absolutely just slide in and make a click so that you know they are properly seated.
The SATA power connector is probably the only one I never had issues with. Even Molex was annoying, and the pins would sometimes slide out of it.
I agree with the 24-pin. It's just old and clunky and should be replaced with something more convenient.

Apart from that, my only issue is with SATA cables, which for some reason don't click into 2.5" HDDs, making installation in a desktop PC a bit of a pain.
 
Didn't many people say the same thing about the 20 and the 30 series too? It seems like there is a lot of complaining ahead of every Nvidia launch these days.

But I don't believe Nvidia will launch "bad" cards in the RTX 4060 / 4060 Ti segments.
Cards went from 11 GB to 10 GB to 8 GB
When they go backwards each generation, it's a problem

Why? Because VRAM needs go down with DLSS, Nvidia are maxing out their profits making cards that only work right with their supported titles
 
I never liked any of the ATX connectors. The 20/24-pin is annoying, and the 6+2-pin is horrible (connecting three of these to my 3080 was a nightmare; I even tore off a bit of skin on the top of my finger). I never get that click sound, and I have to use quite a bit of force to really push them in. They should absolutely just slide in and make a click so that you know they are properly seated.
The SATA power connector is probably the only one I never had issues with. Even Molex was annoying, and the pins would sometimes slide out of it.
The 24-pin is certainly difficult, but your issues with the 8-pin have to do with cheaply made parts. How can you expect more from parts that cost a few cents apiece?
And what makes you think 12VHPWR plug/socket with much lower tolerances is going to be better?

On the plus side, the 8-pin is very robust. I don't yet see improvements to the 12VHPWR standard to compete with this.

Currently, AMD has better options in the mid-range. Both the 6600 and 6700 series are better deals than the competition right now. We'll see if it holds up with the 7600 and 7700.
It depends on what you are comparing.
At MSRP, the 6600 XT falls roughly midway between the RTX 3060 and RTX 3060 Ti, both performance-wise and price-wise. Similarly, the 6700 XT falls nicely between the RTX 3060 Ti and RTX 3070. Then there are the non-XT and 6650 XT models scattered in between. So, at least to within a few percent, they seem fairly comparable on paper.
If you base your assumption on Newegg, then I agree: the 6600/6700 series seem to be better deals there.
In my region, the prices are more in line with MSRP, but the Radeon cards are often not in stock, or at least most of them are not. And I have followed this pretty closely, as I'm planning to get a couple of cards for software testing.

Cards went from 11 GB to 10 GB to 8 GB
When they go backwards each generation, it's a problem
Which cards? I assume you are referring to older high-end cards?

GTX 1060 3/6 GB -> RTX 2060 6 GB -> RTX 3060 12 GB* -> RTX 4060 8 GB (presumably)
Considering that current cards up to the RTX 3070 Ti run and scale fine with just 8 GB, it's fairly likely that the RTX 4060 and RTX 4060 Ti will too.

*) RTX 3060 has way more VRAM than most other cards in its series.

Why? Because VRAM needs go down with DLSS, Nvidia are maxing out their profits making cards that only work right with their supported titles
This doesn't make much sense.
If DLSS actually worked (i.e., upscaled as well as rendering natively at higher resolutions), it would undermine the entire incentive to buy a high-end card. It's a gimmick to get people to upgrade or choose their products over the competition, and many people fall for it.

It doesn't seem like you understand how VRAM actually works.
GPUs use memory very differently from CPUs: they access large blocks and aren't as sensitive to latency, but are more reliant on throughput. For this reason, they can employ memory compression, which saves storage, bandwidth, and latency. Large buffers and some textures are highly compressible (stencil buffers, Z-buffers, normal maps, displacement maps, etc.). Clever use of tiled rendering can improve VRAM utilization further, as temporary buffers can be cleared and compressed down to almost nothing (which is why a GPU can allocate more VRAM than it actually has without swapping). VRAM compression has improved with every GPU generation, so 8 GB on Ada is much more useful than, e.g., 8 GB on Pascal. Offering anecdotes about how much VRAM a card "actually needs" is pointless. The truth is in the results, and since e.g. the RTX 3070 Ti still scales fine at 4K in most games, that is strong evidence that 8 GB is fine.
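To put rough numbers on this, here is a back-of-the-envelope sketch of uncompressed render-target sizes at common resolutions. The buffer formats and counts are illustrative assumptions (a generic deferred-style setup), not measurements of any real engine, and real drivers compress these further:

```python
# Estimate raw (uncompressed) render-target memory at common gaming
# resolutions, to show how small typical per-frame buffers are next to
# an 8 GB card. Formats and bytes-per-pixel are illustrative guesses.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

# (name, bytes per pixel) -- a simple deferred-style G-buffer sketch
TARGETS = [
    ("color (RGBA16F)", 8),
    ("normals (RGBA8)", 4),
    ("depth/stencil (D32S8)", 5),
    ("motion vectors (RG16F)", 4),
]

for label, (w, h) in RESOLUTIONS.items():
    total = sum(w * h * bpp for _, bpp in TARGETS)
    print(f"{label}: {total / 2**20:.0f} MiB of render targets")
```

Even at 4K this lands well under 200 MiB before compression; the bulk of VRAM goes to textures and geometry, which is exactly where compression and streaming do their work.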
 
It depends on what you are comparing.
At MSRP, the 6600 XT falls roughly midway between the RTX 3060 and RTX 3060 Ti, both performance-wise and price-wise. Similarly, the 6700 XT falls nicely between the RTX 3060 Ti and RTX 3070. Then there are the non-XT and 6650 XT models scattered in between. So, at least to within a few percent, they seem fairly comparable on paper.
If you base your assumption on Newegg, then I agree: the 6600/6700 series seem to be better deals there.
In my region, the prices are more in line with MSRP, but the Radeon cards are often not in stock, or at least most of them are not. And I have followed this pretty closely, as I'm planning to get a couple of cards for software testing.
I always compare UK prices, either through Scan UK (which I most commonly use), or Overclockers UK. Maybe I should put this into my signature. :D

Anyways, at Scan UK, the cheapest 6650 XT is the MSi Mech 2X for £299.99, while the cheapest 3060 in stock is the Asus Phoenix discounted from £383.99 to £359.99 in the Christmas sale.

The cheapest 6700 XT is the Asus Dual for £430.99, while the cheapest 3070 is the MSi Ventus 3X, again discounted from £569.99 to £549.98 in the Christmas sale.

Now, can anyone tell me that the Radeon isn't the better offer in these comparisons, and that the "green tax" doesn't exist?
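For what it's worth, the premiums in those Scan UK prices work out as follows (a trivial check, using only the sale prices quoted above):

```python
# Percentage premium of the cheapest in-stock Nvidia card over the
# nearest Radeon competitor, from the Scan UK prices quoted above.
prices = {
    ("RX 6650 XT", "RTX 3060"): (299.99, 359.99),
    ("RX 6700 XT", "RTX 3070"): (430.99, 549.98),
}
for (amd, nv), (p_amd, p_nv) in prices.items():
    premium = p_nv / p_amd - 1
    print(f"{nv} costs {premium:.0%} more than the {amd}")
```

That's a ~20% and ~28% premium respectively, even with the Nvidia cards on sale.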
 
It's even worse in the US.

The non-XT variant of the 6700 (6700 10 GB) is currently selling for only $20-30 more than an RTX 3050, which is embarrassing (the 3050 is around GTX 1070-level performance, whereas the 6700 non-XT is around RTX 2080/2080 Super level).

The 3060 base model sells for what the 6700 XT goes for, which once again is pretty bad.
 