
NVIDIA GeForce RTX 4060 Ti to Feature Shorter PCB, 220 Watt TDP, and 16-Pin 12VHPWR Power Connector

I wonder why Nvidia is so determined to use the new power connector on every single card. I bet the 30 W GT 4010 will have it too (unless there won't be a 4010 because Nvidia can't be bothered to make a card that doesn't cost your liver and a kidney anymore).
 
I wonder why Nvidia is so determined to use the new power connector on every single card. I bet the 30 W GT 4010 will have it too (unless there won't be a 4010 because Nvidia can't be bothered to make a card that doesn't cost your liver and a kidney anymore).
This is pretty easy to understand: they're trying to make it a standard connector. It's the same reason people used SATA connectors for optical disc drives even though they offered no real advantage over PATA, and the same reason Seagate recently introduced NVMe-equipped HDDs.
 
The new connector is really a stupid idea. If it worked out so badly with the more tech-savvy crowd, I can only imagine how badly it will go with cheaper models and less tech-inclined buyers. Stupid decision. But then again, there's the pricing, and none of it really matters. These things won't sell, they can't, or humanity is lost to idiocy.
 
The new connector is really a stupid idea. If it worked out so badly with the more tech-savvy crowd, I can only imagine how badly it will go with cheaper models and less tech-inclined buyers. Stupid decision. But then again, there's the pricing, and none of it really matters. These things won't sell, they can't, or humanity is lost to idiocy.
True, although the lower power levels of the more mainstream oriented cards are less likely to be a problem in terms of the connector.
 
Nvidia was not kidding when it said it wanted to be like Apple, and that includes forcing unnecessary adapters on customers.
 
Those specs sound pretty bad considering the 3060 Ti had 4864 shaders, 8 GB of GDDR6, a 256-bit bus and 448 GB/s of bandwidth. I would have thought 10 GB of memory and at least the same bandwidth.
These specs claim a 128-bit bus with 288 GB/s of bandwidth.
Seems Nvidia is shifting the whole product stack down, which is bad for the prices they are asking now.
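For what it's worth, the bandwidth figures follow directly from bus width and memory speed; a minimal sketch of the arithmetic (the 18 Gbps GDDR6 rate for the 4060 Ti is an assumption inferred from the 288 GB/s rumor, not a confirmed spec):

```python
# Rough sanity check: GDDR bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps).
# The 18 Gbps rate for the 4060 Ti is an assumption inferred from the 288 GB/s rumor.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"RTX 3060 Ti: {bandwidth_gb_s(256, 14):.0f} GB/s")            # 448 GB/s
print(f"RTX 4060 Ti (rumored): {bandwidth_gb_s(128, 18):.0f} GB/s")  # 288 GB/s
```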

Yes, people keep paying more for less. Nvidia keeps asking more for lower-tier cards, and people happily fork out the cash.

This is what happens when people don't listen to those who call out their marketing BS. If we are extremely lucky, we might see normalized prices again two generations from now. Otherwise, I suggest you start taking up another hobby.
 
Why are you hung up on specs? Isn't real-world performance what matters, regardless of how it's achieved?
If someone released a fast GPU with a 128-bit memory bus running at very high speed, wouldn't you consider it?
Or if someone released a quad-core CPU which kicked the a** of 8-cores, wouldn't you buy it?

We are likely to see faster GDDR6X/7 soon, perhaps even with this generation.
I should have been more specific.
For my needs (mainstream gaming, not afraid to tone down some settings), cards with 128-bit memory buses typically didn't cut it. I tend not to have high hopes for them, but I will buy depending on what the benchmarks say, not the sticker on the box.
 
High-end GPUs like the GTX 1080 used to have a power consumption of 180 W; now even mainstream models are more power hungry.
Mine hit 220 W overclocked.
Now the 60 series is at that wattage, which I see as backwards progression.

I'm gonna have to start paying attention to performance per watt as the true sign of a series of GPUs being an upgrade or not - it's like the GeForce FX days all over again (high wattages, screaming fans, products worse than their previous-gen counterparts).
 
I'm gonna have to start paying attention to performance per watt as the true sign of a series of GPUs being an upgrade or not…
The two triple-P's (performance per power and performance per price) are the true indicators of progress, imo.

Frames don't mean anything on their own. One can always add more cores, and bam... behold a faster GPU!
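If anyone wants to score cards this way, the math is trivial; here's a minimal sketch with placeholder numbers (the perf, wattage and price values below are made up for illustration, not benchmark results):

```python
# Rank cards by performance per watt and performance per dollar.
# The numbers below are placeholders, not benchmark results.
cards = {
    "Card A": {"perf": 100, "watts": 200, "price": 400},
    "Card B": {"perf": 130, "watts": 220, "price": 500},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['watts']:.2f} perf/W, "
          f"{c['perf'] / c['price']:.2f} perf/$")
```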
 
Poor evil nvidia :kookoo:

RTX 3060 Ti - 4864 shaders, 8 GB, 200 watts
RTX 4060 Ti - 4352 shaders, 8 GB, 220 watts

So, 4060 Ti will be slower and yet more power hungry? :kookoo:

The 4060 Ti is on a TSMC node, so clocks will be a good chunk higher; it will be faster anyway, probably 15-20% faster.
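As a rough illustration of why fewer shaders at higher clocks can still come out ahead, peak FP32 throughput is just shaders x 2 x clock; a sketch (the 2.6 GHz boost clock for the 4060 Ti is a guess, not a confirmed spec, and peak TFLOPS never translates 1:1 into game performance):

```python
# Peak FP32 TFLOPS = shaders * 2 (FMA) * boost clock (GHz) / 1000.
# The 2.6 GHz clock for the 4060 Ti is a guess, not a confirmed spec.
def peak_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000

print(f"RTX 3060 Ti: {peak_tflops(4864, 1.665):.1f} TFLOPS")        # ~16.2
print(f"RTX 4060 Ti (guess): {peak_tflops(4352, 2.60):.1f} TFLOPS")  # ~22.6
```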
 
…So, 4060 Ti will be slower and yet more power hungry? :kookoo:
Apply DLSS 3.
Fixed.

#sarcasm
 
I should have been more specific.
For my needs (mainstream gaming, not afraid to tone down some settings), cards with 128-bit memory buses typically didn't cut it. I tend not to have high hopes for them, but I will buy depending on what the benchmarks say, not the sticker on the box.
In the end, yeah, I do that too. The thing is, memory was always a juggling act with regard to bus width. It's a cost and energy budget aspect there too. If they can make more efficient (use of) memory, that's a win in the end, as long as there are no painful sacrifices.

And that's Ampere's first line-up in a nutshell, for sure. With Ada it seems memory is in a better place, but the bandwidth similarity between two cards with very different core power is an interesting one. How good is that cache, and where will it fall short...
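To put a number on the cache question, a crude model is that every byte served from the large L2 never touches the 128-bit bus, so effective bandwidth scales roughly as raw bandwidth divided by the miss rate. A minimal sketch (the hit rates are illustrative guesses, not measurements):

```python
# Crude model: traffic served from the large L2 never hits the 128-bit bus,
# so effective bandwidth ~ raw bandwidth / (1 - hit_rate).
# Hit rates here are illustrative guesses, not measured values.
def effective_bandwidth(raw_gb_s: float, hit_rate: float) -> float:
    return raw_gb_s / (1 - hit_rate)

for hit_rate in (0.0, 0.3, 0.5):
    print(f"L2 hit rate {hit_rate:.0%}: ~{effective_bandwidth(288, hit_rate):.0f} GB/s effective")
```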
 
The new connector is really a stupid idea. If it worked out so badly with the more tech-savvy crowd, I can only imagine how badly it will go with cheaper models and less tech-inclined buyers. Stupid decision. But then again, there's the pricing, and none of it really matters. These things won't sell, they can't, or humanity is lost to idiocy.
Also remember, standards are usually used in workstations and server racks too. I'm sure those customers will be super enthusiastic about a bad standard nobody needed.

On the positive side, RTX 40 series cards will probably be very valuable to collectors 20 years from now… :rolleyes:
(sarcasm)

I hope at least some RTX 40 series AIB cards have standard 8-pin plugs.

And BTW, where did all the new PSUs with ATX 3.0 go? It's been awfully quiet for a while. I wonder if they're holding them back, revising the standard, or something else is going on.
 
Also remember, standards are usually used in workstations and server racks too. I'm sure those customers will be super enthusiastic about a bad standard nobody needed.
Actually, adoption is probably going to be faster in those segments. Intel has already shown that its Data Center GPU Max 1100 will use the 16-pin connector in its PCIe form factor version, despite the fact that it's a 300 W GPU.

Sooner or later even AMD will adopt it, unless they're expecting people to use four 8-pin connectors on the RX 8900 XTX OC...
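For the connector arithmetic: each PCIe 8-pin is rated for 150 W, the slot provides up to 75 W, and 12VHPWR signals up to 600 W via its sideband pins, which is where the "four 8-pin" comparison comes from. A quick sketch of the budgets:

```python
# Power budgets per the PCIe CEM spec: slot 75 W, 8-pin 150 W, 12VHPWR up to 600 W.
SLOT_W = 75
EIGHT_PIN_W = 150
TWELVE_VHPWR_MAX_W = 600

print("Four 8-pin + slot:", 4 * EIGHT_PIN_W + SLOT_W, "W")    # 675 W
print("12VHPWR + slot:   ", TWELVE_VHPWR_MAX_W + SLOT_W, "W")  # 675 W
```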
 
The two triple-P's (performance per power and performance per price) are the true indicators of progress, imo.

Frames don't mean anything on their own. One can always add more cores, and bam... behold a faster GPU!
Price I care less about, because I can always wait and buy year-old or second-hand hardware (as can anybody, simply wait for a sale).

No amount of waiting makes high-wattage cards any more efficient, and historically the most loathed cards are the highest-wattage ones (especially above 250 W) - always remembered as hot, noisy and failing early (as the cooling weakens; general users rarely even replace thermal paste, let alone thermal pads on VRMs and VRAM).
 
This generation is turning into a joke. 8 GB with a 128-bit bus? For how much, $500? And this is supposed to last two years, with truly next-gen games not even out yet?

This is what we should be seeing at these super high prices:
4080 Ti - cut down AD102, 320-bit, 20 GB
4080 - full AD103, 256-bit, 16 GB (G6X)
4070 + Ti - cut down AD103, 256-bit, 16 GB (G6 non-X)
4060 + Ti - AD104, 192-bit, 12 GB (G6 for 4060, G6X for Ti)
4050 - AD106, 128-bit, 8 GB

While the L2 cache might help with the narrow memory buses, the VRAM capacity will be a huge problem in the coming years, especially with ray tracing.

People should really buy the RTX 30 series and the RX 6000 series while they are still available at good prices. Ignore this new garbage and wait 2 years for the next generation.
 
8 GB is a joke, unbalanced specs. Fast card, too little VRAM.
 
People should really buy the RTX 30 series and the RX 6000 series while they are still available at good prices. Ignore this new garbage and wait 2 years for the next generation.
Didn't many people say the same thing about the 20 and the 30 series too? It seems like there is a lot of complaining ahead of every Nvidia launch these days.

But I don't believe Nvidia will launch "bad" cards in the RTX 4060 / 4060 Ti segments.
 
Didn't many people say the same thing about the 20 and the 30 series too? It seems like there is a lot of complaining ahead of every Nvidia launch these days.

The 20 series was expensive, but it introduced RT and tensor cores. Those chips were huge, but raster performance was not that much better than Pascal.

I do not remember people complaining about the 30 series, at least the 3080 and 3070. I mean I pre-ordered the 3080 on day 1. For $700 it gave me 80% more performance (100% in RT) over my 2070 Super that I paid $500 for a year before. It is one of NVIDIA's best cards ever, incredible value.
The lower tier cards were not as impressive, though. But the bigger problem was that you could not buy them at MSRP. And now that you can, Radeon cards are sooo much cheaper.

If you look at the GTX 1060, you realize how badly NVIDIA is screwing over the mid-range segment these days. That was 980 performance with more VRAM for just $250.
Even the 2060 offered 1080 performance on average at $350. But the 3060 only offered 2070 (non-Super) performance at "$330". And the 4060 is supposed to offer 3070 performance at $400 according to leaks. Not only is the price going up (which can be understandable), but the performance gains are going down (which is not). It should be one or the other.
 
…If you look at the GTX 1060, you realize how badly NVIDIA is screwing over the mid-range segment these days…
The 960 and 1060 are legendary cards, imo. They represent everything Nvidia should be. They're high-tech, innovative, energy-conscious mid-range cards with no over-the-top cooling requirements. "The way it's meant to be played", as they say. A lot of people complained about Turing, but I had no issue with it - it was all about introducing ray tracing to an architecture that is otherwise similar to Pascal. It was with Ampere that it all went downhill. First with blaming COVID, then the chip shortage (yeah, right...), crypto miners, scalpers, etc. And now with Ada, Nvidia is doing the scalping themselves. They seem to think that mining is still a thing and they can do whatever they want, and as long as sales justify it, they will.
 
The 960 and 1060 are legendary cards, imo. They represent everything Nvidia should be…
The GTX 960 was great, but the GTX 970 was an even greater deal. Despite all the misconceptions and FUD about the slower 0.5 GB of VRAM, it was really a card which performed very close to the GTX 980, thanks to striking a better resource balance and achieving higher performance per GFLOP than the rest of the series.

While I agree that prices these days are too high, this mostly applies to the high-end models. The mid-range models, albeit a little pricey, are nowhere near that bad. (This is also where there is some competition.)
But my bigger point, regarding what THU31, xorbe and many others said: they are basically hung up on specs. It's very common for people to prejudge products this way based on specs, even though none of us knows the final specs or performance level. And I'm pretty sure many people like this will change their tune once the products arrive, realizing they are "better than expected", etc. I've followed this too long not to notice the pattern.

So to restate what I said in #70, I don't think Nvidia will launch "bad" cards in this segment; they can't afford to do that. These cards will make up a good share of their revenue.
 
I wonder why Nvidia is so determined to use the new power connector on every single card. I bet the 30 W GT 4010 will have it too (unless there won't be a 4010 because Nvidia can't be bothered to make a card that doesn't cost your liver and a kidney anymore).
Force us into it, like the Intel 12VO deal.
 
The specs seem worse than they actually are (because of the L2 cache and very high clocks), but the point is that we are getting smaller performance gains while prices are increasing. It would be very different if the 4060 were supposed to offer 3080 performance at $400 instead of 3070 performance.
And VRAM capacity is an actual problem. 8 GB was already problematic on the 3070.

660 -> 960 - 37% performance increase with a 15% price decrease (I am ignoring the 760 because it was a refresh)
960 -> 1060 - 100% performance increase with a 25% price increase (there was also a slightly slower model with 3 GB at just $200, same as 960)
1060 -> 2060 - 56% performance increase with a 40% price increase
2060 -> 3060 - 21% performance increase with a 6% price decrease
3060 -> 4060 - ~45% performance increase with a 21% price increase (if leaks are true)

It seems pretty clear why the 1060 was the number 1 card on Steam for almost 6 years. And it is beyond my comprehension why they do not want to achieve the same success with any card after that. Corporations' obsession with margins instead of overall profits is incredible (I experienced that myself working for one).
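For anyone who wants to reproduce that comparison, here's a minimal sketch that turns the quoted deltas into a rough perf-per-dollar change per generation (the percentages are the ones listed above; the 3060 -> 4060 row is leak-based):

```python
# Perf-per-dollar change: (1 + perf_gain) / (1 + price_change) - 1.
# Percentages are the ones quoted above; the 3060 -> 4060 row is based on leaks.
gens = [
    ("660 -> 960",   0.37, -0.15),
    ("960 -> 1060",  1.00,  0.25),
    ("1060 -> 2060", 0.56,  0.40),
    ("2060 -> 3060", 0.21, -0.06),
    ("3060 -> 4060", 0.45,  0.21),
]

for name, perf_gain, price_change in gens:
    value = (1 + perf_gain) / (1 + price_change) - 1
    print(f"{name}: {value:+.0%} perf per dollar")
```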
 