Wednesday, December 14th 2022

NVIDIA GeForce RTX 4060 Ti to Feature Shorter PCB, 220 Watt TDP, and 16-Pin 12VHPWR Power Connector

While NVIDIA has launched the high-end GeForce RTX 4090 and RTX 4080 GPUs from its Ada Lovelace family, middle and lower-end products are brewing to satisfy the rest of the consumer market. Today, according to well-known leaker kopite7kimi, we have potential information about the configuration of the upcoming GeForce RTX 4060 Ti graphics card. The card is based on the AD106-350-A1 die, featuring 4352 FP32 CUDA cores and 32 MB of on-die L2 cache. This is paired with 8 GB of 18 Gbps GDDR6 memory, which should be enough for the 1440p gaming that this card is aimed at.

The card's reference PG190 PCB is supposedly very short, making it ideal for the ITX-sized designs we could see from NVIDIA's AIB partners. Interestingly, despite a TDP of just 220 Watts, the reference card is powered by the infamous 16-pin 12VHPWR connector, capable of supplying 600 Watts. The reasoning behind this choice of connector is unclear; however, it could be NVIDIA's push to standardize its usage across the entire Ada Lovelace product stack. While the card should not need the connector's full potential, it signals that the company may use only this type of connector in all of its future designs.
Source: @kopite7kimi (Twitter)

82 Comments on NVIDIA GeForce RTX 4060 Ti to Feature Shorter PCB, 220 Watt TDP, and 16-Pin 12VHPWR Power Connector

#51
RedBear
AusWolfI wonder why Nvidia is so determined to use the new power connector on every single card. I bet the 30 W GT 4010 will have it too (unless there won't be a 4010, because Nvidia can't be arsed to make a card that doesn't cost your liver and a kidney anymore).
This is pretty easy to understand: they're trying to make it the standard connector. It's the same reason people used SATA connectors for optical disc drives even though they offered no real advantage over PATA, and the same reason Seagate recently introduced NVMe-equipped HDDs.
Posted on Reply
#52
Bomby569
The new connector is really a stupid idea. If it worked out so badly with the more tech-savvy crowd, I can only imagine how badly it will go with cheaper models and less tech-inclined buyers. Stupid decision. But then again, there's the pricing, and none of it really matters. These things won't sell, they can't, or humanity is lost to idiocy.
Posted on Reply
#53
shovenose
Bomby569The new connector is really a stupid idea. If it worked out so badly with the more tech-savvy crowd, I can only imagine how badly it will go with cheaper models and less tech-inclined buyers. Stupid decision. But then again, there's the pricing, and none of it really matters. These things won't sell, they can't, or humanity is lost to idiocy.
True, although the lower power levels of the more mainstream-oriented cards make the connector less likely to be a problem.
Posted on Reply
#54
evernessince
Nvidia was not kidding when it said it wanted to be like Apple and that includes forcing unnecessary adapters on customers.
Posted on Reply
#55
Legacy-ZA
Ed_1Those specs sound pretty bad considering the 3060 Ti had 4864 shaders, 8 GB of GDDR6, a 256-bit bus and 448 GB/s of bandwidth. I would have thought 10 GB of memory and at least the same bandwidth.
These specs claim a 128-bit bus with 288 GB/s of bandwidth.
Seems Nvidia is shifting the whole product stack down, which is bad for the prices they are asking now.
Yes, people keep paying more for less; Nvidia keeps asking more for lower-tier cards, and people happily fork out the cash.

This is what happens when people don't listen to those who call out their marketing BS. If we are extremely lucky, we might see normalized prices again two generations from now. Otherwise, I suggest you start taking up another hobby.
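
Side note: the quoted bandwidth figures are easy to sanity-check. A minimal sketch in Python, taking the 3060 Ti's 256-bit/14 Gbps configuration and the rumored 128-bit/18 Gbps for the 4060 Ti at face value:

# Peak bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps
def mem_bandwidth(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth(256, 14))  # 3060 Ti: 448.0 GB/s, matching the quote above
print(mem_bandwidth(128, 18))  # rumored 4060 Ti: 288.0 GB/s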
Posted on Reply
#56
bug
efikkanWhy are you hung up on specs? Isn't real-world performance what matters, regardless of how it's achieved?
If someone released a fast GPU with a 128-bit memory bus running at very high speed, wouldn't you consider it?
Or if someone released a quad-core CPU which kicked the a** of 8-core CPUs, wouldn't you buy it?

We are likely to see faster GDDR6X/7 soon, perhaps even with this generation.
I should have been more specific.
For my needs (mainstream gaming, not afraid to tone down some settings), cards with 128-bit memory buses typically didn't cut it. I tend not to have high hopes for them, but I will buy depending on what the benchmarks say, not the sticker on the box.
Posted on Reply
#57
Mussels
Freshwater Moderator
krimetalHigh-end GPUs like the GTX 1080 used to have a power consumption of 180 W; now even mainstream models are more power-hungry.
Mine overclocked hit 220 W.
Now the 60 series is at that wattage, which I see as backwards progression.


I'm gunna have to start paying attention to performance per watt as the true sign of whether a series of GPUs is an upgrade or not - it's like the GeForce FX days all over again (high wattages, screaming fans, products worse than their previous-gen counterparts).
Posted on Reply
#58
AusWolf
MusselsMine overclocked hit 220 W.
Now the 60 series is at that wattage, which I see as backwards progression.


I'm gunna have to start paying attention to performance per watt as the true sign of whether a series of GPUs is an upgrade or not - it's like the GeForce FX days all over again (high wattages, screaming fans, products worse than their previous-gen counterparts).
The two triple-P's (performance per power and performance per price) are the true indicators of progress, imo.

Frames don't mean anything on their own. One can always add more cores, and bam... behold a faster GPU!
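
A minimal sketch of those two metrics, with entirely made-up numbers just to illustrate the point:

# Hypothetical cards: (relative performance, board power in W, price in USD)
cards = {"old_gen": (100, 200, 400), "new_gen": (130, 220, 500)}
for name, (perf, watts, price) in cards.items():
    # A new generation is only progress if both ratios improve
    print(f"{name}: {perf / watts:.2f} perf/W, {perf / price:.3f} perf/$")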
Posted on Reply
#59
Gungar
ARFPoor evil nvidia :kookoo:

RTX 3060 Ti - 4864 shaders, 8 GB, 200 watts
RTX 4060 Ti - 4352 shaders, 8 GB, 220 watts

So, the 4060 Ti will be slower and yet more power-hungry? :kookoo:
The 4060 Ti is on a TSMC node, so it will clock a good chunk higher and end up faster anyway, probably by 15-20%.
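
A rough first-order sketch of that reasoning (throughput ~ shaders x clock); the 4060 Ti boost clock is a pure assumption here, and real-world gains would land well below raw throughput once the narrower memory bus and game scaling kick in:

shaders_3060ti, boost_3060ti = 4864, 1.67  # GHz, the 3060 Ti's rated boost
shaders_4060ti, boost_4060ti = 4352, 2.60  # GHz, assumed Ada-class boost
ratio = (shaders_4060ti * boost_4060ti) / (shaders_3060ti * boost_3060ti)
print(f"~{(ratio - 1) * 100:.0f}% more raw shader throughput")  # ~39%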
Posted on Reply
#60
Dirt Chip
ARFPoor evil nvidia :kookoo:

RTX 3060 Ti - 4864 shaders, 8 GB, 200 watts
RTX 4060 Ti - 4352 shaders, 8 GB, 220 watts

So, the 4060 Ti will be slower and yet more power-hungry? :kookoo:
Apply DLSS3.
Fix.

#sarcasm
Posted on Reply
#61
ixi
CrackongLet me guess

$500 MSRP and $600 actual AIB price
The real price will be around $700 or $800 :D
Posted on Reply
#62
Vayra86
bugI should have been more specific.
For my needs (mainstream gaming, not afraid to tone down some settings), cards with 128-bit memory buses typically didn't cut it. I tend not to have high hopes for them, but I will buy depending on what the benchmarks say, not the sticker on the box.
In the end, yeah, I do that too. Thing is, memory bus width was always a juggling act; it's a cost and an energy-budget consideration too. If they can make more efficient (use of) memory, that's a win in the end, as long as there are no painful sacrifices.

And that's Ampere's first lineup in a nutshell. With Ada, it seems memory is in a better place, but the bandwidth similarity between two cards with very different core power is an interesting one. How good is that cache, and where will it fall short...
Posted on Reply
#63
efikkan
Bomby569The new connector is really a stupid idea. If it worked out so badly with the more tech-savvy crowd, I can only imagine how badly it will go with cheaper models and less tech-inclined buyers. Stupid decision. But then again, there's the pricing, and none of it really matters. These things won't sell, they can't, or humanity is lost to idiocy.
Also remember that these standards usually end up in workstations and server racks too. I'm sure those customers will be super enthusiastic about a bad standard nobody needed.

On the positive side, RTX 40 series cards will probably be very valuable to collectors 20 years from now… :rolleyes:
(sarcasm)

I hope at least some RTX 40 series AIB cards have standard 8-pin plugs.

And BTW, where did all the new ATX 3.0 PSUs go? It's been awfully quiet for a while. I wonder if they're holding them back, revising the standard, or if something else is going on.
Posted on Reply
#64
RedBear
efikkanAlso remember that these standards usually end up in workstations and server racks too. I'm sure those customers will be super enthusiastic about a bad standard nobody needed.
Actually, adoption is probably going to be faster in those segments. Intel has already shown that its Data Center GPU Max 1100 will use the 16-pin connector in its PCIe form-factor version, despite the fact that it's a 300 W GPU.

Sooner or later even AMD will adopt it, unless they're expecting people to use four 8-pin connectors on the RX 8900 XTX OC...
Posted on Reply
#65
Mussels
Freshwater Moderator
AusWolfThe two triple-P's (performance per power and performance per price) are the true indicators of progress, imo.

Frames don't mean anything on their own. One can always add more cores, and bam... behold a faster GPU!
Price I care less about, because I can always wait and buy year-old or second-hand hardware (as can anybody; simply wait for a sale).

No amount of waiting makes high-wattage cards any more efficient, and historically the most loathed cards are the highest-wattage ones (especially above 250 W) - always remembered as hot, noisy and failing early (as the cooling weakened; general users rarely even replace thermal paste, let alone the thermal pads on VRMs and VRAM).
Posted on Reply
#66
thewan
TheLostSwedeGDDR6X is pricey... :rolleyes:
Rolling your eyes doesn't remove your responsibility to read. Please read before you reply.
Posted on Reply
#67
THU31
This generation is turning into a joke. 8 GB with a 128-bit bus? For how much, $500? And this is supposed to last two years, when truly next-gen games don't even exist yet?

This is what we should be seeing at these super high prices:
4080 Ti - cut down AD102, 320-bit, 20 GB
4080 - full AD103, 256-bit, 16 GB (G6X)
4070 + Ti - cut down AD103, 256-bit, 16 GB (G6 non-X)
4060 + Ti - AD104, 192-bit, 12 GB (G6 for 4060, G6X for Ti)
4050 - AD106, 128-bit, 8 GB

While the L2 cache might help with the narrow memory buses, the VRAM capacity will be a huge problem in the coming years, especially with ray tracing.

People should really buy the RTX 30 series and the RX 6000 series while they are still available at good prices. Ignore this new garbage and wait 2 years for the next generation.
Posted on Reply
#68
xorbe
8 GB is a joke, unbalanced specs. Fast card, too little VRAM.
Posted on Reply
#69
efikkan
THU31People should really buy the RTX 30 series and the RX 6000 series while they are still available at good prices. Ignore this new garbage and wait 2 years for the next generation.
Didn't many people say the same thing about the 20 and the 30 series too? It seems like there is a lot of complaining ahead of every Nvidia launch these days.

But I don't believe Nvidia will launch "bad" cards in the RTX 4060 / 4060 Ti segments.
Posted on Reply
#70
THU31
efikkanDidn't many people say the same thing about the 20 and the 30 series too? It seems like there is a lot of complaining ahead of every Nvidia launch these days.
The 20 series was expensive, but it introduced RT and tensor cores. Those chips were huge, but raster performance was not that much better than Pascal.

I do not remember people complaining about the 30 series, at least the 3080 and 3070. I mean I pre-ordered the 3080 on day 1. For $700 it gave me 80% more performance (100% in RT) over my 2070 Super that I paid $500 for a year before. It is one of NVIDIA's best cards ever, incredible value.
The lower tier cards were not as impressive, though. But the bigger problem was that you could not buy them at MSRP. And now that you can, Radeon cards are sooo much cheaper.

If you look at the GTX 1060, you realize how badly NVIDIA is screwing over the mid-range segment these days. That was 980 performance with more VRAM for just $250.
Even the 2060 offered 1080 performance on average at $350. But the 3060 only offered 2070 (non-Super) performance at "$330". And the 4060 is supposed to offer 3070 performance at $400 according to leaks. Not only is the price going up (which can be understandable), but the performance gains are going down (which is not). It should be one or the other.
Posted on Reply
#71
AusWolf
THU31The 20 series was expensive, but it introduced RT and tensor cores. Those chips were huge, but raster performance was not that much better than Pascal.

I do not remember people complaining about the 30 series, at least the 3080 and 3070. I mean I pre-ordered the 3080 on day 1. For $700 it gave me 80% more performance (100% in RT) over my 2070 Super that I paid $500 for a year before. It is one of NVIDIA's best cards ever, incredible value.
The lower tier cards were not as impressive, though. But the bigger problem was that you could not buy them at MSRP. And now that you can, Radeon cards are sooo much cheaper.

If you look at the GTX 1060, you realize how badly NVIDIA is screwing over the mid-range segment these days. That was 980 performance with more VRAM for just $250.
Even the 2060 offered 1080 performance on average at $350. But the 3060 only offered 2070 (non-Super) performance at "$330". And the 4060 is supposed to offer 3070 performance at $400 according to leaks. Not only is the price going up (which can be understandable), but the performance gains are going down (which is not). It should be one or the other.
The 960 and 1060 are legendary cards, imo. They represent everything Nvidia should be: high-tech, innovative, energy-conscious mid-range cards with no over-the-top cooling requirements. "The way it's meant to be played," as they say. A lot of people complained about Turing, but I had no issue with it - it was all about introducing ray tracing to an architecture that is otherwise similar to Pascal. It was with Ampere that it all went downhill: first with blaming covid, then the chip shortage (yeah, right...), crypto miners, scalpers, etc. And now with Ada, Nvidia is doing the scalping themselves. They seem to think that mining is still a thing and that they can do whatever they want, and as long as sales justify it, they will.
Posted on Reply
#72
efikkan
AusWolfThe 960 and 1060 are legendary cards, imo. They represent everything Nvidia should be…
The GTX 960 was great, but the GTX 970 was an even greater deal. Despite all the misconceptions and FUD about the slower 0.5 GB of VRAM, it was really a card which performed very close to the GTX 980, thanks to striking a better resource balance and achieving higher performance per GFLOP than the rest of the series.

While I agree that prices these days are too high, this mostly applies to the high-end models. The mid-range models, albeit a little pricey, are nowhere near that bad. (This is also where there is some competition.)
But my bigger point, regarding what THU31, xorbe and many others said: they are basically hung up on specs. It's very common for people to prejudge products this way based on specs, even though none of us knows the final specs or performance level. And I'm pretty sure many people like this will change their tune once the products arrive, realizing they are "better than expected", etc. I've followed this too long not to notice the pattern.

So, to restate what I said in #69: I don't think Nvidia will launch "bad" cards in this segment; they can't afford to do that. These cards will make up a good share of their revenue.
Posted on Reply
#73
eidairaman1
The Exiled Airman
AusWolfI wonder why Nvidia is so determined to use the new power connector on every single card. I bet the 30 W GT 4010 will have it too (unless there won't be a 4010, because Nvidia can't be arsed to make a card that doesn't cost your liver and a kidney anymore).
Force us into it, like the Intel ATX12VO deal.
Posted on Reply
#74
THU31
The specs seem worse than they actually are (because of the L2 cache and very high clocks), but the point is that we are getting smaller performance gains while prices are increasing. It would be very different if the 4060 were supposed to offer 3080 performance at $400 instead of 3070 performance.
And VRAM capacity is an actual problem. 8 GB was already problematic on the 3070.

660 -> 960 - 37% performance increase with a 15% price decrease (I am ignoring the 760 because it was a refresh)
960 -> 1060 - 100% performance increase with a 25% price increase (there was also a slightly slower model with 3 GB at just $200, same as 960)
1060 -> 2060 - 56% performance increase with a 40% price increase
2060 -> 3060 - 21% performance increase with a 6% price decrease
3060 -> 4060 - ~45% performance increase with a 21% price increase (if leaks are true)
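
Each delta above is just the ratio of average performance and launch MSRP between generations; a minimal sketch of the arithmetic using the 2060 -> 3060 step (performance indexed to the older card):

perf_2060, price_2060 = 100, 350  # performance index, launch MSRP in USD
perf_3060, price_3060 = 121, 330  # ~21% faster, per the list above
print(f"{(perf_3060 / perf_2060 - 1) * 100:+.0f}% performance")  # +21%
print(f"{(price_3060 / price_2060 - 1) * 100:+.0f}% price")      # -6%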

It seems pretty clear why the 1060 was the number 1 card on Steam for almost 6 years. And it is beyond my comprehension why they do not want to achieve the same success with any card after that. Corporations' obsession with margins instead of overall profits is incredible (I experienced that myself working for one).
Posted on Reply
#75
efikkan
THU31The specs seem worse than they actually are (because of L2 cache and very high clocks)…
So, let's wait for actual benchmarks of finalized products then. :)
THU31And VRAM capacity is an actual problem. 8 GB was already problematic on the 3070.
Why?
The RTX 4060/4060 Ti is mostly going to be a 1080p card, or 1440p at medium details.

We've had this discussion every generation: people make up subjective anecdotes about how much VRAM a card actually needs, when benchmarks show these cards still scale pretty well even at 4K resolution. I wouldn't worry about 8 GB on such a card, both because I know how graphics actually works, but most importantly because benchmarks show these cards run out of memory bandwidth and computational power long before VRAM under normal (realistic) use cases. Sure, you can load ridiculous custom texture packs which eat VRAM, but that's an unbalanced and unrealistic edge case, which shouldn't be the basis for buying recommendations.

And please don't bring up the future-proofing argument; it's BS and hasn't held true in the past.
THU31660 -> 960 - 37% performance increase with a 15% price decrease (I am ignoring the 760 because it was a refresh)
960 -> 1060 - 100% performance increase with a 25% price increase (there was also a slightly slower model with 3 GB at just $200, same as 960)
1060 -> 2060 - 56% performance increase with a 40% price increase
2060 -> 3060 - 21% performance increase with a 6% price decrease
3060 -> 4060 - ~45% performance increase with a 21% price increase (if leaks are true)
If you take into account the inflation over the product's lifespan, it doesn't look bad at all. Most years it's ~1.5-2.5%, but as we know, the past couple of years combined add up to ~15% in the official numbers (~25%+ if we look at real inflation since the old models launched). So, using the same assumptions as you: 3060 -> 4060, ~45% extra performance at roughly the same real price. Why complain then?
(Not trying to spawn a political discussion here, just trying to look at things from a real-world perspective.)
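
A minimal sketch of that adjustment, taking the ~15% cumulative official inflation and the leaked $400 price at face value:

msrp_3060, leaked_4060 = 330, 400  # USD
cumulative_inflation = 0.15        # assumed official figure, per above
real_price = leaked_4060 / (1 + cumulative_inflation)
print(f"4060 in 3060-era dollars: ~${real_price:.0f}")  # ~$348, close to $330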
THU31It seems pretty clear why the 1060 was the number 1 card on Steam for almost 6 years.
GTX 1060 was undoubtedly one of the greatest card deals of all time (possibly the greatest seller too?), and its dominance had several contributing factors:
- The card was good (obviously), and despite the critics, the cheap 3 GB version was plenty for most buyers in this segment.
- Supplies were great most of the time, and the card could often be found below MSRP, especially the 3 GB version.
- The competition, the RX 480/580, was in really short supply.
- The successor, Turing, was late (~2.5 years), and supplies weren't great initially.

I've built three systems with this card myself and recommended it many times. The Pascal generation has held up pretty well overall, too.
THU31And it is beyond my comprehension why they do not want to achieve the same success with any card after that. Corporations' obsession with margins instead of overall profits is incredible (I experienced that myself working for one).
The solution is more competition. AMD (and Intel) needs to produce good cards in the mid-range and have good availability in all markets to really push prices down. Otherwise the current trend will continue.
Posted on Reply