
16GB Variant of GeForce RTX 4060 Ti Launches July 18

lol, I'm just waiting for NV and AMD to conclude that the dGPU market is dying rather than realizing their sales are tanking because they pushed crappy products at inflated prices. Not that the dGPU market isn't in decline; there are a whole lot of consoles out there these days. Price, convenience, and they (relatively, compared to the chaos of PC configs) mostly just work.

I suspect the lackluster sales of this crappy gen of dGPUs are only gonna accelerate that transition. Idk.

I find this a bit regrettable, as the higher res/higher perf and more complex controls of PC suit me; games designed to be played with six buttons* are very different from ones designed for keyboard & mouse (or joystick, etc.).

*Whatever the actual number is. I don't own one, and there are a couple of brands out there selling well.
 
The issue isn't us not understanding the numbers. People who visit sites like TechPowerUp are in the top 1% when it comes to knowing and caring about PCs and their components. The point is that the numbers are dishonest, by design, to trick consumers. Mislead them. You know, lying.

"Oh but you should just read the actual numbers" isn't a valid excuse for a massive company to mislead the public. But we all know people are going to rush to the defense of a multi-billion dollar corporation. Who cares if it's hurting consumers and ultimately you, right? The precious corporations must be protected and defended online, for free.
Trick customers how? Mislead them into what?
 
I'll have to admit, Arc is looking pretty good, save for its power (in)efficiency. Battlemage can't come soon enough.

I think the hardware specs of Arc are impressive, but software support is lacking relative to the competition. At least that's my impression looking at it. Still, I suspect Arc would be a strong GPU for ReShade within its price range, given that the hardware specs are more robust.
 
That card might be popular among video editors on a budget, just like the 3060 12GB before it. The 12 GB 3060 beats or ties the 3060 Ti in every single video test for a slightly lower price.
  • Relatively low price (compared to the 4070 and 4080) - check;
  • Same NV video encoder (as the 4070 and 4080) - check;
  • AI and RT cores used in video editing - check;
  • More VRAM than the 4070 - check.
For gaming, I'd say the 4070 would be a much better investment, or, if you don't bother with useless RT, an AMD GPU.
 
$100 for 8 GB of VRAM? :oops:
Apple charges $200 for 8 GB of standard DDR RAM. :)

Why should I pay €200+ for an Nvidia graphics card with 16 GB when I can have an Intel Arc A770, also with 16 GB? My desktop doesn't care which one it is; it will be displayed equally well and equally fast on both.

Because:
- Arc only supports DirectX 12 in hardware; the lower DirectX versions are emulated.
- Intel drivers suck.
 
So it can choke on an additional 8 GB of VRAM? WTF.
Yeah, this is a FFFFFFFFFFFFFFk moment. All they had to do was open the bus up a bit more and they'd have had it... but noooooo! Let's slap on moar memory (at effing reduced cost from the foundries) and call it a day.

GOD... I hate Silicon Valley.
 
They'd need to use AD103 for that, or cut the memory bus down from 192-bit to 128-bit, which would hurt performance.

Ironically, that is what the RTX 5070 is going to be; 128-bit is here to stay and will climb the ladder into the 70-tier: a 104-die 3N shrink of the 4080, and since it is ~70% denser it will barely be able to fit a 128-bit bus. Same fate as the 4060 Ti, which is somewhat of a 2070 with dual-issue 2304 shaders and a 32 MB L2$. But the transistor count is that of a 2080 Ti, so that is better suited for comparison, and it packs 4608 CUDA as well.

Well, forget it; this gen is a flop. Nvidia will not yield no matter what, and even if they do, it's too late.
 
If these had 21 Gbps or 23 Gbps memory then they'd probably sell, and perform, a fair bit better. Seems like a real missed opportunity.
Awaiting a good sale price on this, or an overclocked model, or maybe a 4060 Super with GDDR6X... doubt that'll happen though.

A 60-series card should not be 8 GB in 2023, for starters. Nvidia is really doing its best to make bank with this gen's 60 series, and people aren't having any of it.

I'd be more interested in an Intel Arc A770 32GB Special Edition with a 360 mm AIO.

Battlemage is where it's at. Not far off.
 
Because:
- Arc only supports DirectX 12 in hardware; the lower DirectX versions are emulated.
- Intel drivers suck.
1. Does that matter when the machine is running Linux? Both displays show the desktop. Two or three times a year I use a 3D scanner; otherwise it's LibreCalc, Excel and Lazarus/Gambas. More rarely, I re-encode a video to put it on YouTube.

2. Intel drivers might suck for playing games on Windows, but I don't do that. Nvidia drivers suck on Linux. Nvidia or Intel with Linux is like choosing between plague and cholera. ;) On Linux it is best to choose AMD. But try to find a more powerful two-slot card, no matter whether it's AMD, Intel or Nvidia.
 
1. Does that matter when the machine is running Linux? Both displays show the desktop. Two or three times a year I use a 3D scanner; otherwise it's LibreCalc, Excel and Lazarus/Gambas. More rarely, I re-encode a video to put it on YouTube.

2. Intel drivers might suck for playing games on Windows, but I don't do that. Nvidia drivers suck on Linux. Nvidia or Intel with Linux is like choosing between plague and cholera. ;) On Linux it is best to choose AMD. But try to find a more powerful two-slot card, no matter whether it's AMD, Intel or Nvidia.
I can't comment on Linux because I don't use Linux, but on Windows, AMD and especially Intel suck.
 
16 GB, that is nice. But I have already settled on waiting for next gen: AMD's, Intel's and Nvidia's next big things.
 
I can't comment on Linux because I don't use Linux, but on Windows, AMD and especially Intel suck.
If you don't use Linux, how are you able to know that Intel drivers suck on Linux? I don't care about Windows. I own only one Windows notebook, for my 3D scanner; the rest of my systems run Linux.
 
There are numbers that matter and numbers that don't.

Actual gaming numbers and price matter.
Efficiency matters for some. Bus width, VRAM size, manufacturing process only matter for very specific needs.
Numbers on the box don't matter at all. I used to have a 6600GT, then I had a GTX 260, now I have a 1060... I haven't bought a single one because of the model number or the codename of the silicon die.
If you ignore the name of the card, the 4060 Ti 16GB is going to perform about the same as other products from AMD and Nvidia at $350. There's literally a 50% 40-series tax.
 
If you ignore the name of the card, the 4060 Ti 16GB is going to perform about the same as other products from AMD and Nvidia at $350. There's literally a 50% 40-series tax.
I wouldn't judge the "tax" for the whole series based on one SKU, but yeah. You get DLSS3 and better efficiency for that "tax", but that's it...
Fwiw, I might still pull the trigger on a 4060Ti 8GB, but I will definitely not pay $500 (+tax) for the 16GB version.
 
I think the 16GB model does not target gamers, but rather users who use the GPU for other stuff too.
Either work or hobby or gaming, but definitely mixed use.
 
I think the 16GB model does not target gamers, but rather users who use the GPU for other stuff too.
Either work or hobby or gaming, but definitely mixed use.
The issue with the 4060 Ti for that is its lack of memory bandwidth.
 
If you don't use Linux, how are you able to know that Intel drivers suck on Linux? I don't care about Windows. I own only one Windows notebook, for my 3D scanner; the rest of my systems run Linux.
That's what I'm saying: I don't know Linux because I don't use Linux, but on WINDOWS, AMD and Intel drivers suck.
 
The issue with the 4060 Ti for that is its lack of memory bandwidth.
Exactly. It's often far worse in productivity workloads than the older, cheaper 3060 Ti because it doesn't have the bandwidth to stretch its computational advantage.
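(For rough numbers from the published memory specs: the 4060 Ti runs 18 Gbps GDDR6 on a 128-bit bus, i.e. 18 × 128 / 8 = 288 GB/s, while the GDDR6 3060 Ti runs 14 Gbps on a 256-bit bus, i.e. 14 × 256 / 8 = 448 GB/s. That's roughly 36% less raw bandwidth for the newer card, which leans on its much larger L2 cache to compensate.)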
 
That's what I'm saying: I don't know Linux because I don't use Linux, but on WINDOWS, AMD and Intel drivers suck.
I don't think you will have problems with any driver just showing your desktop, no matter which OS. And that is exactly what I wrote about, and I stated it clearly in my first post. How useful is your reaction with those facts in mind? Did you read my posts, or did you just react allergically to the words "Intel" and "Arc"?
 
I don't think you will have problems with any driver just showing your desktop, no matter which OS. And that is exactly what I wrote about, and I stated it clearly in my first post. How useful is your reaction with those facts in mind? Did you read my posts, or did you just react allergically to the words "Intel" and "Arc"?
Yes I do: with AMD and Intel I can't set 10-bit color for my smart TV, only with Nvidia cards.
 
Yes I do: with AMD and Intel I can't set 10-bit color for my smart TV, only with Nvidia cards.
I think your TV or HDMI cable is garbage, or there's some PEBKAC occurring; AMD have made plenty of mistakes, but driver support for colour depth isn't one of them:

[screenshot: AMD driver colour-depth setting]


I can get 10 or 12-bit colour on my 6700XT, 6800XT, and Steam Deck. Across hundreds of machines at the office and homes, I've never once had issues that couldn't be attributed to shit displays or damaged cables.

I wouldn't judge the "tax" for the whole series based on one SKU, but yeah. You get DLSS3 and better efficiency for that "tax", but that's it...
Fwiw, I might still pull the trigger on a 4060Ti 8GB, but I will definitely not pay $500 (+tax) for the 16GB version.
What's a 6800XT cost in your region?
Raster performance of a 6800XT is vastly superior to the 4060 Ti, and the RT performance is similar at worst, much better otherwise.

I'm not trying to discourage you from buying Nvidia, but unless you need CUDA and DLSS frame gen, it's hugely overpriced compared to AMD and also Ampere. You'll struggle to buy a 3080 in many regions (including the UK), but the 3080 is equivalent in value to a 6800XT on the used/refurb market.
 
I think your TV or HDMI cable is garbage, or there's some PEBKAC occurring; AMD have made plenty of mistakes, but driver support for colour depth isn't one of them:

[screenshot: AMD driver colour-depth setting]

I can get 10 or 12-bit colour on my 6700XT, 6800XT, and Steam Deck. Across hundreds of machines at the office and homes, I've never once had issues that couldn't be attributed to shit displays or damaged cables.
I tried different cables, even high-end 8K cables, both HDMI and DisplayPort-to-HDMI; none work.

With my old 1050 Ti I had no issues with 10-bit on my smart TV.
 
Yes I do: with AMD and Intel I can't set 10-bit color for my smart TV, only with Nvidia cards.
Is there a visible difference between 8-bit and 10-bit on a normal desktop whilst editing a worksheet in LibreCalc/Excel, or in an IDE like Gambas, Lazarus or Visual Studio? I use two 32" monitors with a native resolution of 2560x1440. My daily-used sheet has around 2500 lines of code in LibreCalc. I want to see the macro IDE on one screen and the original worksheet on the other when developing the macros I need. When developing software, I want the debugged code on one monitor and the debugged program on the other. Is the difference visible when slicing a 3D object for a 3D printer?
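(For what it's worth, where 8-bit vs 10-bit output actually becomes visible is smooth gradients, e.g. skies in photos or dark scene fades, not text, spreadsheets or slicer views. A quick way to check a pipeline for banding is a synthetic ramp. Minimal sketch, assuming numpy and opencv-python are installed; the file name is arbitrary, and the viewer and display chain must themselves run at 10 bits, otherwise you will simply see the 8-bit steps:)

```python
import numpy as np
import cv2

# Build a 16-bit grayscale ramp the width of a 2560x1440 monitor.
# On an 8-bit pipeline the ramp is quantised to 256 steps (visible banding);
# a true 10-bit pipeline renders it noticeably smoother.
width, height = 2560, 400
ramp = np.linspace(0, 65535, width, dtype=np.uint16)  # left-to-right gradient
img = np.tile(ramp, (height, 1))                      # repeat the row vertically
cv2.imwrite("gradient_16bit.png", img)                # PNG supports 16-bit grayscale
```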

It's much harder to find a fast graphics card that is at most 2 slots thick, has at least 16 GB of RAM and can output to 2 DisplayPorts. I also want one more HDMI and one more DisplayPort output free as an additional option. My new rig has a case with a glass side; I install the graphics card with a riser card. The 2 slots are needed because the case does not support thicker cards. If possible, the card should also have a decent look, with no eye-catching bling. Have fun searching for one.
 
Is there a visible difference between 8-bit and 10-bit on a normal desktop whilst editing a worksheet in LibreCalc/Excel, or in an IDE like Gambas, Lazarus or Visual Studio? I use two 32" monitors with a native resolution of 2560x1440. My daily-used sheet has around 2500 lines of code in LibreCalc. I want to see the macro IDE on one screen and the original worksheet on the other when developing the macros I need. When developing software, I want the debugged code on one monitor and the debugged program on the other. Is the difference visible when slicing a 3D object for a 3D printer?

It's much harder to find a fast graphics card that is at most 2 slots thick, has at least 16 GB of RAM and can output to 2 DisplayPorts. I also want one more HDMI and one more DisplayPort output free as an additional option. My new rig has a case with a glass side; I install the graphics card with a riser card. The 2 slots are needed because the case does not support thicker cards. If possible, the card should also have a decent look, with no eye-catching bling. Have fun searching for one.
If you need a high-end video card in just 2 slots, get a water-cooled one; those are super thin.
 
If you need a high-end video card in just 2 slots, get a water-cooled one; those are super thin.
Did I say that I need a high-end card? To show a desktop? It would be possible to get a 4070 Ti with only 2 slots, but I don't want to; it's a waste of time. In former times I always used Matrox cards. That Intel blah-blah-blah card is enough to show my desktop without any issue. And so we are back at my first post and my question: why should I pay more? With Intel I get a good card to show a desktop. Additionally, I can re-encode videos for e.g. YouTube with that Intel card quite fast. And on top of that, it has device drivers for Linux Mint. I don't need more than that.
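(On the re-encoding point: on Linux the usual route to the Arc card's hardware encoder is ffmpeg's VAAPI path. A minimal sketch, assuming ffmpeg is built with VAAPI support, that the card is exposed as /dev/dri/renderD128 (the render node differs on multi-GPU systems), and with the file names and bitrate as placeholders:)

```python
import subprocess

def transcode_for_youtube(src: str, dst: str, bitrate: str = "8M") -> None:
    """Re-encode src to an H.264 MP4 using the GPU's VAAPI encoder via ffmpeg."""
    cmd = [
        "ffmpeg",
        "-vaapi_device", "/dev/dri/renderD128",  # assumed render node for the Arc card
        "-i", src,
        "-vf", "format=nv12,hwupload",           # convert and upload frames to a GPU surface
        "-c:v", "h264_vaapi",                    # VAAPI hardware H.264 encoder
        "-b:v", bitrate,
        "-c:a", "aac", "-b:a", "192k",           # re-encode audio to AAC
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_for_youtube("input.mkv", "output.mp4")
```

Swapping h264_vaapi for hevc_vaapi (or av1_vaapi) works the same way, if your ffmpeg build and driver expose those encoders.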
 