Tuesday, July 11th 2023

No Official Review Program for NVIDIA GeForce RTX 4060 Ti 16 GB Cards

NVIDIA is reported to be taking a hands-off approach to the launch of its GeForce RTX 4060 Ti 16 GB GPU next week—rumored to take place on July 18. Murmurs from last week posited that add-in card (AIC) partners were not all that confident in the variant's prospects, with very little promotional activity lined up. NVIDIA itself is not releasing a Founders Edition GeForce RTX 4060 Ti 16 GB model, so it would be relying on board partners to get custom-design units sent out to press outlets/reviewers. According to Hardware Unboxed, as posted on Twitter earlier today, no hardware will be distributed to the media: "Now there's no official review program for this model, there will be no FE version and it seems that NVIDIA and their partners really don't want to know about it. Every NVIDIA partner I've spoken to so far has said they won't be providing review samples, and they're not even sure when their model will be available."

Their announcement continued: "So I don't know when you'll be able to view our review, but I will be buying one as soon as I can. I expect coverage will be pretty thin and that's probably the plan, the release strategy here is similar to that of the RTX 3080 12 GB." TPU can confirm that test samples have not been sent out by NVIDIA's board partners, so a retail unit will be purchased (out of pocket) for reviewing purposes. Previous reports have theorized that not many custom models will be available at launch, with the series MSRP of $499 not doing it many favors in terms of buyer interest. MSI has prepared a new white GAMING X design for the 16 GB variant, so it is good to see at least one example of an AIB putting the effort in...but it would be nice to get a press sample.
Sources: Hardware Unboxed Tweet, VideoCardz

52 Comments on No Official Review Program for NVIDIA GeForce RTX 4060 Ti 16 GB Cards

#26
bug
Keivz: I would expect fewer than 50 non-reviewers worldwide to purchase this at MSRP. It will be a fun product for review purchasers as a comparison piece of hardware, but it is in no way worth the sticker price. Anyone in their right mind with a $500 GPU budget would be better off going with a 3070, or better yet a 6800 XT.
As far as performance, I think we can expect little to no improvement over the 8 GB variant—except in very few cases, which may in part be due to the limited bus width, but also because many of the games that led to the outcry over 8 GB of VRAM have been patched up fairly nicely. We shall see.
Tbh, everybody knows how these will perform. We just need a few trusted sources to confirm and put some real numbers next to those expectations.
Posted on Reply
#27
Vayra86
TheinsanegamerN: It IS a 4050 Ti; the 106 dies are almost never used as x60-tier cards. Those are usually cut-down 104s.

The 4060 Ti should have been an 8 GB 4050 Ti for $200-250 at most.
Wrong, the 106 is historically the x60. The 104 populates the x70 and (usually) the x80, the 102 the top end, and the 103 is new, slotting in between them.

Any gen it doesn't, Nvidia is shifting pricing and strategy; case in point: everything since Turing. Ever since then we've lost sight of truly solid GPU generations. Now with Ada they've positioned (= priced) everything below the x90 so badly that nothing is worth paying for.

Pascal gave us a 3/4-cut 104 x70 with 8 GB at a similar price point to this weaksauce, memory/bandwidth-neutered 4060 Ti. Nuff said... skip this. Even the 16 GB is beyond saving, because it still lacks bandwidth. Every dollar spent above $300 is too much. The 8 GB should have been $250, with $299 for the god-tier (lel) AIB treatment perhaps, to part fools with their money.

(edited, mistakenly thought 1080 was 499, corrected)
Posted on Reply
#28
wheresmycar
So,

NVIDIA don't want it

Partners don't want it

Consumers don't want it

The 60-class performance tier doesn't want it

The $500 performance realm doesn't want it

The skimped bus/bandwidth doesn't want it

LIKE WE DIDN'T SEE THAT COMING! or GOING! or maybe ALREADY GONE!

I hope the disappointment stands at launch. I still can't get over the 4080 at $1,200. Everything else just slotted in tits-up (or down, depending on how you look at it). The worst part: I don't think it matters to nV. You'd expect them to learn their lesson and deliver next gen on a flowered bed, but nah, they'll be too busy counting greens from the AI boom to consider us gaming minions.
Posted on Reply
#29
Crackong
bug: The 8 GB should be no more than $350 and the 16 GB no more than $400.
I think there should be just one SKU: rename the 4060 Ti to 4060, give it 16 GB, and price it at $329.

And the OG 4060 should be renamed to 4050.
Posted on Reply
#30
AusWolf
Why the lack of attention? I know it's very close to the 4070 in price, but that hasn't stopped Nvidia from hyping garbage before.
Posted on Reply
#31
AndroidBR
Is this card crippled by a narrow bus and x8 PCIe lanes? If it is, it's an instant pass from me regardless of the price. It would work like crap in my no. 2 PC with a B450 board, because that's PCIe 3.0 x16. I wanted to give my son a new GPU to replace the aging 1070 Ti. I guess I'll have to wait and get a 4070 Ti second-hand in a few years, or a 5060 16 GB if nVidia gets their crap together, especially on pricing. These anemic, crippled cards are simply too darn expensive. It's expensive junk.
Posted on Reply
#32
AusWolf
AndroidBR: Is this card crippled by a narrow bus and x8 PCIe lanes? If it is, it's an instant pass from me regardless of the price. It would work like crap in my no. 2 PC with a B450 board, because that's PCIe 3.0 x16. I wanted to give my son a new GPU to replace the aging 1070 Ti. I guess I'll have to wait and get a 4070 Ti second-hand in a few years, or a 5060 16 GB if nVidia gets their crap together, especially on pricing. These anemic, crippled cards are simply too darn expensive. It's expensive junk.
You'd barely lose 1-3% performance on PCIe 3.0 x8, which isn't even noticeable without an FPS counter and performance logging. It's nothing to worry about.

If you insist on replacing the 1070 Ti, and you insist on using a full x16 slot, I'd suggest looking at a 6700 XT or 6750 XT.
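For anyone curious about the raw link rates behind that claim, here's a rough sketch (`pcie_bandwidth_gbs` is an illustrative helper; it accounts only for the 128b/130b line coding and ignores packet/protocol overhead, so real-world throughput is a bit lower):

```python
# Approximate usable per-direction PCIe bandwidth (GB/s),
# counting only line-encoding overhead.
GT_PER_S = {3: 8.0, 4: 16.0}             # transfer rate per lane (GT/s)
ENCODING = {3: 128 / 130, 4: 128 / 130}  # 128b/130b coding for Gen 3/4

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Raw per-direction bandwidth in GB/s for a PCIe link."""
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8.0  # bits -> bytes

print(f"PCIe 3.0 x8: {pcie_bandwidth_gbs(3, 8):.2f} GB/s")  # ~7.88 GB/s
print(f"PCIe 4.0 x8: {pcie_bandwidth_gbs(4, 8):.2f} GB/s")  # ~15.75 GB/s
```

Gen 3 x8 still gives roughly 7.9 GB/s each way, which is why the measured hit in games is only a few percent.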
Posted on Reply
#33
Psyclown
My god what a nightmare hahaha. The 40 series should have their own reality TV show at this point with all the drama and scandals.
Posted on Reply
#34
bug
Psyclown: My god what a nightmare hahaha. The 40 series should have their own reality TV show at this point with all the drama and scandals.
I don't think there's much drama. The way I see it, Nvidia was planning to take prices to new highs. They released the 4080 and 4090, saw that the market didn't respond well, and cut the price a little on the 4070 and 4070 Ti. Then, to release the 4060 and 4060 Ti under $500, they had to seriously cut into the GPU, which resulted in the lower prices and anemic GPUs we have today. It's not like they had many options this late into their design/production cycle. And the expected delay of "Ada next" into 2025 may also be a sign they're going back to the drawing board, hopefully trying to design something they can price more reasonably. That last part could be just wishful thinking on my side, since they can sell a ton to AI operators/wannabes these days.
Posted on Reply
#35
KV2DERP
Who would buy a card that barely edges out the 3060 Ti, with double the VRAM, for $500?
Posted on Reply
#36
chrcoluk
I saw that missing textures were noted in the GN review of the 8 GB version of this card. I'd like reviews to mark any game with VRAM issues as not tested (rather than still presenting a score), the same as when a game crashes.
Posted on Reply
#37
Vayra86
bug: I don't think there's much drama. The way I see it, Nvidia was planning to take prices to new highs. They released the 4080 and 4090, saw that the market didn't respond well, and cut the price a little on the 4070 and 4070 Ti. Then, to release the 4060 and 4060 Ti under $500, they had to seriously cut into the GPU, which resulted in the lower prices and anemic GPUs we have today. It's not like they had many options this late into their design/production cycle. And the expected delay of "Ada next" into 2025 may also be a sign they're going back to the drawing board, hopefully trying to design something they can price more reasonably. That last part could be just wishful thinking on my side, since they can sell a ton to AI operators/wannabes these days.
The x60 (Ti) was planned way earlier. You don't seriously believe they adjusted the stack that late? They just painted themselves into a corner, and it started with their pseudo-12 GB '4080'. They never adjusted a single chip; they just had to rename them. The whole Ada stack suffers from the positioning of the 4080 16 GB. The gap from the x90 on down is too large, so this is what we're left with.

Not so much drama as overconfident marketing. They sell this stack on DLSS 3. It's pretty worthless otherwise compared to its Ampere equivalents.
Posted on Reply
#38
PapaTaipei
Should be jailed for producing e-waste.
Posted on Reply
#39
TheoneandonlyMrK
Hands off? Ahh, that must be why no one's reviewing the card!

@W1zzard do you have any 4060 Ti 16 GB reviews to look into?
Posted on Reply
#40
Assimilator
PapaTaipei: Should be jailed for producing e-waste.
Not just producing waste, but wasting perfectly good memory chips, inductors, etc. that could have gone onto cards that consumers actually want to buy.
TheoneandonlyMrK: @W1zzard do you have any 4060 Ti 16 GB reviews to look into?
AFAIK W1zz doesn't buy cards to review out of his own pocket. And really, there's no point in a review; the only thing it would show versus the 8 GB model is possibly better frames at 4K, when the card doesn't have to swap textures in and out so much thanks to the larger framebuffer... but that's academic anyway, since this isn't a 4K card, even with DLSS.

If NVIDIA had built this card with a fully enabled AD106 GPU, so that you get all 4608 cores versus the 4352 on the 4060 Ti 8 GB, its existence might be justified. Or a variant of AD106 with a 192-bit bus and 12 GB, which should be significantly faster than the plain 8 GB model. Or both. But as is, the 4060 Ti 8+8 GB is just so very pointless.
Posted on Reply
#41
W1zzard
TheoneandonlyMrK: do you have any 4060 Ti 16 GB reviews to look into?
So: everyone I talked to is not sampling the card, and NVIDIA isn't either. NVIDIA did confirm that there won't be a press driver before the launch, only the public WHQL.

Buying a card would be np, but I'm going on holiday from Thursday until Sunday, so we'd be kinda late with our review. Not sure it's worth blowing that money on producing content that goes live after all the YouTubers have had their 30-minute drama videos about the card.
Posted on Reply
#42
TheoneandonlyMrK
W1zzard: So: everyone I talked to is not sampling the card, and NVIDIA isn't either. NVIDIA did confirm that there won't be a press driver before the launch, only the public WHQL.

Buying a card would be np, but I'm going on holiday from Thursday until Sunday, so we'd be kinda late with our review. Not sure it's worth blowing that money on producing content that goes live after all the YouTubers have had their 30-minute drama videos about the card.
No worries, enjoy your holiday :) I was intrigued, but it doesn't matter.
Posted on Reply
#43
Vayra86
W1zzard: So: everyone I talked to is not sampling the card, and NVIDIA isn't either. NVIDIA did confirm that there won't be a press driver before the launch, only the public WHQL.

Buying a card would be np, but I'm going on holiday from Thursday until Sunday, so we'd be kinda late with our review. Not sure it's worth blowing that money on producing content that goes live after all the YouTubers have had their 30-minute drama videos about the card.
Don't cover it. The Ada x60 tier is something we'd best forget fast.

Let it rot! And enjoy your holiday, spend the money on more cocktails instead :)
Posted on Reply
#44
wheresmycar
I agree with Vayra86... I'm not even remotely interested in seeing how the 4060 Ti 16 GB stacks up. Put the price down to $400 and it still doesn't spark any interest (well, for me anyway).
Posted on Reply
#45
sLowEnd
wheresmycar: I agree with Vayra86... I'm not even remotely interested in seeing how the 4060 Ti 16 GB stacks up. Put the price down to $400 and it still doesn't spark any interest (well, for me anyway).
idk, I like to see all sorts of hardware documented, good or bad. History would be so boring and incomplete if only good things were recorded.
Posted on Reply
#46
wheresmycar
sLowEnd: idk, I like to see all sorts of hardware documented, good or bad. History would be so boring and incomplete if only good things were recorded.
well there is that! Can't argue there.

Actually it would be useful data to discourage people from buying into it :nutkick:
Posted on Reply
#47
W1zzard
sLowEnd: idk, I like to see all sorts of hardware documented, good or bad. History would be so boring and incomplete if only good things were recorded.
Yeah, that's how I feel too. Also, people who have no idea and think "16 > 8" need to be able to find a high-quality source online that explains more.
Posted on Reply
#48
wNotyarD
Assimilator: If NVIDIA had built this card with a fully enabled AD106 GPU, so that you get all 4608 cores versus the 4352 on the 4060 Ti 8 GB, its existence might be justified. Or a variant of AD106 with a 192-bit bus and 12 GB, which should be significantly faster than the plain 8 GB model. Or both. But as is, the 4060 Ti 8+8 GB is just so very pointless.
Either NVIDIA/TSMC can't push out fully working AD106 chips, or there's a 4060 Ti Super Leather Jacket Edition in the oven (with 8 GB of GDDR6X over a 128-bit bus, possibly; I don't know how much more VRAM can be put on such a bus).
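On the "how much VRAM fits on such a bus" question, a quick sketch (assuming GDDR6-style 32-bit channels, today's 1 GB/2 GB chip capacities, and optional "clamshell" mode with two chips per channel; `vram_options_gb` is an illustrative helper):

```python
# Possible VRAM capacities for a given memory bus width.
# Each GDDR6 chip sits on a 32-bit channel; clamshell mode
# pairs two chips per channel to double total capacity.
def vram_options_gb(bus_width_bits: int, chip_gb=(1, 2)) -> list[int]:
    channels = bus_width_bits // 32
    normal = [cap * channels for cap in chip_gb]
    clamshell = [cap * channels * 2 for cap in chip_gb]
    return sorted(set(normal + clamshell))

print(vram_options_gb(128))  # [4, 8, 16] -> 16 GB needs clamshell
print(vram_options_gb(192))  # [6, 12, 24]
```

That is exactly why the 4060 Ti 16 GB exists at all: on a 128-bit bus the only step up from 8 GB (with current chip densities) is a clamshell 16 GB, with no bandwidth gain.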
Posted on Reply
#49
PapaTaipei
wNotyarD: Either NVIDIA/TSMC can't push out fully working AD106 chips, or there's a 4060 Ti Super Leather Jacket Edition in the oven (with 8 GB of GDDR6X over a 128-bit bus, possibly; I don't know how much more VRAM can be put on such a bus).
It's not an oven it's a feces fermentation device.
Posted on Reply
#50
TheoneandonlyMrK
wNotyarD: Either NVIDIA/TSMC can't push out fully working AD106 chips, or there's a 4060 Ti Super Leather Jacket Edition in the oven (with 8 GB of GDDR6X over a 128-bit bus, possibly; I don't know how much more VRAM can be put on such a bus).
Read the title, 16 GB
Posted on Reply