
NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code

That is a strong argument, indeed. People will simply not play things they can't run. Time will tell who blinks first, the consumer or the industry's push.

But consider for a moment that usually the very same people do applaud the RT push, the new tech, etc. ;) They already blinked.

What will save them is that gamers will still buy a game if it's really good, because they want to play it, and they'll just play at low settings until they can upgrade, like everyone has done since PC gaming started. People play games on toasters, and it has never stopped anyone.

The shitty games won't sell, and the devs will blame the weather and whatever else. These days that's most AAA games.
 
His videos lately are just BS. He chooses settings to hog the VRAM of the 3070, and then acts surprised it stutters like crazy. The actual question is: what is the image quality impact in those games if you drop textures to High instead of Ultra? Not a lot, I'd imagine, and that's why he isn't testing it. It won't generate as many clicks as just pooping on Nvidia will.

Why isn't he, for example, testing AMD vs Nvidia on Cyberpunk's new path tracing upgrade? I really wonder.

Given that price-comparable cards can run the games at the chosen settings without stuttering or really ugly texture swapping, it is an issue with the 3070. The fact that you need to downgrade the IQ on the 3070 versus price-comparable cards is not great.

This is only going to happen in more and more games as devs drop PS4 and Xbox One development and focus solely on PC, PS5 and Series X.
 
But consider for a moment that usually the very same people do applaud the RT push, the new tech, etc. ;) They already blinked.

I'm not taking that personally. I came from a long line of top-end NV cards, up to the awfully priced 2080 Ti (I had money come my way - so why not). But along the way, I bought a hardware G-Sync monitor (and it works well, especially in sub-60 fps games like Control and CP2077). I'm tied into that ecosystem to a degree; otherwise, I'd probably have bought a 7900 XT, although the same rules apply -- no more expensive than £800. Also, I've another rule for the hell of it, which is that I'll move up a tier when I can get 50% extra perf for the same power budget. The 4070 Ti gave that over my 2080 Ti. The 7900 XT didn't quite. Plus, my card's ultra silent.

It's not always about blinking first.
 
I don't think Nvidia cares much about whether or not you buy their new card. They're not putting much VRAM on them in order to prevent pros from going for the mainstream models instead of the Quadros that sell for much more.

Now, with that said, someone could make the same argument about AMD: their cards are starved of RT performance to force you to upgrade. The thing is, anyone with a 3080 or a 3090 will feel that the 7900 XTX is a sidegrade in some areas and a downgrade in others, which it is when it comes to losing DLSS and the RT performance.
If losing some features in only a handful of titles, in exchange for a 24-32% performance uplift, is a "sidegrade", then that end user was never considering AMD anyway.
 
The 4070 Ti is a shitshow and the 4060 Ti will be even more so, due to an even narrower memory bus, so why even bother with 16 gigs on a 128-bit bus dGPU?

I was misled by the 4070 Ti's review benchmarks. I pulled the trigger on it to replace my 3090 due to the favorable power consumption, and oh boy, what a mistake that has been. My primary use for these GPUs is VR gaming at around 3124 x 3056 px*2 resolution (the HP Reverb G2's 1x SS resolution), and at this res the 4070 Ti's narrow bus comes into play, causing really bad micro stutters with huge ms spikes, making it borderline unusable for VR. True, the fps averages of the 4070 Ti are slightly better than the 3090's, but the 3090 totally kills it in terms of smoothness, even when undervolted/underclocked to 1750 MHz / 750 mV to get consumption down to 250 W. The whole Ada gen, with the exception of the 4090, is an utter joke :banghead:
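
To put some numbers on the "narrow bus" complaint, here is a minimal sketch of the bandwidth arithmetic. The bus widths and per-pin data rates below are the commonly listed specs for these cards and should be treated as assumptions rather than measured figures:

```python
# Theoretical peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Commonly listed specs, assumed here for illustration
cards = {
    "RTX 3090    (384-bit GDDR6X @ 19.5 Gbps)": (384, 19.5),
    "RTX 4070 Ti (192-bit GDDR6X @ 21.0 Gbps)": (192, 21.0),
    "RTX 4060 Ti (128-bit GDDR6  @ 18.0 Gbps)": (128, 18.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# -> roughly 936, 504 and 288 GB/s. Ada's large L2 cache hides part of that gap
#    at lower resolutions, but a high-resolution VR render target blows past it.
```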
 
I'm not taking that personally. I came from a long line of top-end NV cards, up to the awfully priced 2080 Ti (I had money come my way - so why not). But along the way, I bought a hardware G-Sync monitor (and it works well, especially in sub-60 fps games like Control and CP2077). I'm tied into that ecosystem to a degree; otherwise, I'd probably have bought a 7900 XT, although the same rules apply -- no more expensive than £800. Also, I've another rule for the hell of it, which is that I'll move up a tier when I can get 50% extra perf for the same power budget. The 4070 Ti gave that over my 2080 Ti. The 7900 XT didn't quite. Plus, my card's ultra silent.

It's not always about blinking first.
I agree on the blinking. That 7900 XT wasn't the 'optimal' choice either for me... just the most interesting, really. Could have waited even longer... just didn't want to.
 
Considering that 80% or more of gamers don't have more than 8GB of VRAM, I would say they're just dumb for doing so. But these are the same geniuses that can't release a finished game to save their lives, so I guess it checks out.
More than half of all gamers are on current-gen consoles, where VRAM is 10GB+, and those 10GB+ are more efficiently allocated than on a desktop GPU with Windows and driver overheads. So no, it's not 80% of gamers. Only about 20% of that 80% of PC gamers actually play AAA games on PC. As a percentage of the AAA gaming market, something approaching 90% of gamers have more than 8GB of RAM.

It's about choosing the relevant statistic for the issue at hand, and the issue is that developers make games for the largest demographic - which is console-first and PC second. Of those 80% of machines in the Steam Hardware Survey that have 8GB or less, only a tiny fraction will be playing the latest AAA titles.
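
To make the back-of-envelope math explicit, here is a tiny sketch using the rough figures from the post above; every number is an illustrative assumption, not survey data:

```python
# Rough share of the AAA market with more than 8 GB of (V)RAM, using the post's assumptions.
console_share = 0.55          # assumed: just over half of AAA players are on PS5 / Series X (10 GB+)
pc_share = 1 - console_share
pc_aaa_on_8gb_or_less = 0.20  # assumed: of PC players who actually buy AAA titles, ~20% use <=8 GB cards

share_above_8gb = console_share + pc_share * (1 - pc_aaa_on_8gb_or_less)
print(f"{share_above_8gb:.0%}")   # -> 91%, i.e. "approaching 90%" of the AAA market
```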
 
The 4070 Ti is a shitshow and the 4060 Ti will be even more so, due to an even narrower memory bus, so why even bother with 16 gigs on a 128-bit bus dGPU?

I was misled by the 4070 Ti's review benchmarks. I pulled the trigger on it to replace my 3090 due to the favorable power consumption, and oh boy, what a mistake that has been. My primary use for these GPUs is VR gaming at around 3124 x 3056 px*2 resolution (the HP Reverb G2's 1x SS resolution), and at this res the 4070 Ti's narrow bus comes into play, causing really bad micro stutters with huge ms spikes, making it borderline unusable for VR. True, the fps averages of the 4070 Ti are slightly better than the 3090's, but the 3090 totally kills it in terms of smoothness, even when undervolted/underclocked to 1750 MHz / 750 mV to get consumption down to 250 W. The whole Ada gen, with the exception of the 4090, is an utter joke :banghead:
Yeah, I'm having that issue as well. My wife bought me a Quest 2 and I really wanted to play Flight Simulator in VR.

The Quest 2 doesn't like the AMD encoder, so I can't really go with them. On the Nvidia side, the 4070s are gimped, and lots of people on the Flight Simulator forum were complaining about the stutter you're experiencing.

Thanks to Nvidia artificially raising the price of the 4070 to the $600-650 range, I can't even get a worthwhile price on a used 3080 (and I would need a worthwhile price if I'm going to gamble on a card that has a 50% chance of having been mined on).
 
I want 60 FPS in shooters. I will be alright with a little less in other genres.

Yeah, I consider 60 to be the bare minimum myself, regardless of genre. 120 fps is nicer, but not a must as long as I can keep the settings up and use tools like Special K to do some frametime magic and keep it smooth :)
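
For anyone wondering what the "frametime magic" boils down to: a frame limiter simply holds each frame until a fixed deadline so frametimes come out even instead of spiky. A toy sketch of the idea (not Special K's actual implementation):

```python
import time

def paced_loop(target_fps: float, render_frame, frames: int = 300) -> None:
    """Toy frame limiter: do the variable-cost render work, then wait out the
    rest of the frame budget so every frame lands on an even interval."""
    frame_budget = 1.0 / target_fps
    next_deadline = time.perf_counter() + frame_budget
    for _ in range(frames):
        render_frame()                       # variable-cost work
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)            # real limiters busy-wait the last bit for precision
        next_deadline += frame_budget

# Example: ~8 ms of "rendering" per frame, padded out to an even 16.7 ms cadence
# paced_loop(60, lambda: time.sleep(0.008))
```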

The 4070 Ti is a shitshow and the 4060 Ti will be even more so, due to an even narrower memory bus, so why even bother with 16 gigs on a 128-bit bus dGPU?

I was misled by the 4070 Ti's review benchmarks. I pulled the trigger on it to replace my 3090 due to the favorable power consumption, and oh boy, what a mistake that has been. My primary use for these GPUs is VR gaming at around 3124 x 3056 px*2 resolution (the HP Reverb G2's 1x SS resolution), and at this res the 4070 Ti's narrow bus comes into play, causing really bad micro stutters with huge ms spikes, making it borderline unusable for VR. True, the fps averages of the 4070 Ti are slightly better than the 3090's, but the 3090 totally kills it in terms of smoothness, even when undervolted/underclocked to 1750 MHz / 750 mV to get consumption down to 250 W. The whole Ada gen, with the exception of the 4090, is an utter joke :banghead:

Yep, that is exactly why I did not upgrade this generation. I'm unwilling to pay what they ask for the 4090, and no other GPU provides me with an upgrade worth my time. The 4080 isn't enough and the 7900 XTX has too many downsides for it to be worth it.

I'll be waiting for RDNA 4 and Blackwell. In the meantime, I've upgraded my processor to something that will last a considerable amount of time, and I'll finally go after buying a super high-end display.
 
Given that price-comparable cards can run the games at the chosen settings without stuttering or really ugly texture swapping, it is an issue with the 3070. The fact that you need to downgrade the IQ on the 3070 versus price-comparable cards is not great.

This is only going to happen in more and more games as devs drop PS4 and Xbox One development and focus solely on PC, PS5 and Series X.
But you need to downgrade the IQ on the comparable card as well. A 6700 XT, for instance, doesn't max out Hogwarts Legacy. Not because of VRAM, but because of a lack of performance, both raster and RT. So at 1440p you have to activate FSR to get a decent framerate. So my question is very simple: does FSR Quality + Ultra textures look better than DLSS Quality + High textures, which is what you'll play with on a 3060 Ti / 3070? That's what I want to see tested before I conclude which card is better.
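
One detail worth keeping in mind for that comparison: both FSR 2 and DLSS in Quality mode render internally at roughly 67% of the output resolution per axis (a 1.5x upscale), so neither side is actually rendering at native 1440p. A quick sketch of the arithmetic:

```python
# Internal render resolution for the "Quality" preset of FSR 2 / DLSS (~1.5x upscale per axis)
def quality_render_res(out_w: int, out_h: int, upscale: float = 1.5) -> tuple[int, int]:
    return round(out_w / upscale), round(out_h / upscale)

print(quality_render_res(2560, 1440))   # -> (1707, 960): what the GPU renders before upscaling to 1440p
```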
 