
NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

I honestly think the issue with this generation of GPUs so far is twofold. One is that the performance increase isn't that significant, and the GPU companies are largely trying to use AI to convince people that DLSS or FSR etc. is a great solution. The problem with that is you still end up with the drawbacks you already had if you are running a lower-end GPU, so in the end it isn't that helpful.

The second problem is more drastic. For a long time, every generation you expected a leap in performance and, typically, in power consumption. Say you owned a GTX 970 back in the day: you expected that when the 1070 came out it would be faster than the previous-gen GTX 980, and also do it with better efficiency and less power draw. This is starting to not be the case. But even where it still is, the real issue is that they aren't simply giving you that improvement; what little they do deliver, they make you pay for.

What use would a 1070 have been if it merely matched the 980, a little more power efficiently, but cost what the 980 did? That is the issue. Whatever gains each recent generation gives us, they make you pay for by raising the price. So all you end up with is a slightly more efficient GPU; they push you toward software features via AI etc., and then don't backport those features, to try to get you to upgrade.

They're just giving us half-baked everything, really, and then saying: oh, by the way, you have to pay the increased cost as well.
 
Production costs and inflation still don't justify why GPUs are the only component that has tripled in price in the last 10 years.
The truth is that during the mining rush they got away with absurd prices and they don't want to come down now.
 
Production costs and inflation still don't justify why GPUs are the only component that has tripled in price in the last 10 years.
The truth is that during the mining rush they got away with absurd prices and they don't want to come down now.

I share the same sentiments. Previous owners and scalpers were selling top-end 30-series cards for ridiculous sums of money during Covid/crypto. Even I was surprised to see what people were prepared to pay for these cards. Nvidia got a whiff of it and now we have "in-house scalping", "never-ending pandemic crisis prices" and "greedy, self-inflated inflation" on our hands. From a business perspective, Nvidia has hit GOLD!! and the green monster and its shareholders won't let it slide so easily until the cracks emerge. Yes, cracks, as opposed to AMD or Intel catching up... they're no different in the grand scheme of up-the-price-ante insatiability. Only it took the Greens to set the loathsome bar so high.
 
Production costs and inflation still don't justify why GPUs are the only component that has tripled in price in the last 10 years.
The truth is that during the mining rush they got away with absurd prices and they don't want to come down now.
It certainly feels that way. Realistically I suspect it's a mix of greed, economics, foundry demand, and increased TDP contributing to higher prices in roughly equal measure.
  • Greed is evident in the fact that Nvidia caved to the outcry over the 4080 12GB, and now in the near-immediate price cuts on the 4070.
  • Economics have changed dramatically in just the last half decade: Chinese wages, exchange rates, trade wars and import duties, legislation, global shipping costs, etc.
  • Foundry demand, with Nvidia and Intel being two insanely rich customers added to the already-expensive bidding war for TSMC capacity against AMD, Apple, and Qualcomm.
  • Today's midrange cards use 175-250 W, which means an appropriately priced cooler and VRM. Going back to the 10-series, for example, the 1060 6GB was just 120 W.
 
Ah yes, the difference in averages again; yes, I agree on those. Third time now you've repeated that ;) Like I said, enjoy that 8GB while it lasts. I'll just eat more popcorn as the years go by and whole groups of people suddenly feel 'forced' to upgrade to yet another Nvidia midrange card with lacking memory. This has been happening since Maxwell. It happened to 4GB Fury X owners and 1060 3GB owners, and it'll happen to 8GB Ampere owners too. At that point you're also left with a card that has abysmal resale value because, quite simply, it won't do well anymore. That sentiment has clearly already begun to land as well.
I have had two years of satisfaction with this video card and will have at least two more. You took into account the release years of those games, well done!, but it seems you don't know that some of them are among the most-played games online in the world. I'll let you play with 128TB online; 4GB is enough for me.

Finally
1. I play WoT and AW (exclusively online) on the UHD 770. The 8GB of VRAM on the RTX 3070 Ti is probably to blame for my lack of superior performance as a gamer. With 24GB it would definitely raise my rating; I'd reach the top 10.

2. For Blender, going from a 3070 Ti to a 7900 XTX 24GB (oooooooohooooo!!!) just means a downgrade. The people at Puget Systems say so.
 
Actually it was 2019-2020:-

"Most relevant to potential buyers of the GTX 1660 Ti is the GeForce GTX 1660 Super, which delivers similar performance to the 1660 Ti, at a lower starting price of $229. At this writing, that's about $30 less than the lowest-price GTX 1660 Ti".
https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1660-ti-turing,6002.html
Technically correct, but hard to class the 16-series as equivalent to the earlier 20-series. Realistically, all of the 16-series were below even the base-model 2060. Nvidia just messed around with the naming that generation and spread Turing over two series rather than one.
 
Actually it was 2019-2020:-

"Most relevant to potential buyers of the GTX 1660 Ti is the GeForce GTX 1660 Super, which delivers similar performance to the 1660 Ti, at a lower starting price of $229. At this writing, that's about $30 less than the lowest-price GTX 1660 Ti".
https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1660-ti-turing,6002.html
I wouldn't really classify the 16xx series as xx60 Ti cards.

The closest they got that gen to a 2060 stopgap was arguably the 2060 Super, which came out only 8 months after the 2060.

Technically correct, but hard to class the 16-series as equivalent to the earlier 20-series. Realistically, all of the 16-series were below even the base-model 2060. Nvidia just messed around with the naming that generation and spread Turing over two series rather than one.
Exactly. The closest we got to a midrange Ti card was the 2060 Super, which was a $400-430 card.
 
I have had two years of satisfaction with this video card and will have at least two more. You took into account the release years of those games, well done!, but it seems you don't know that some of them are among the most-played games online in the world. I'll let you play with 128TB online; 4GB is enough for me.

Finally
1. I play WoT and AW (exclusively online) on the UHD 770. The 8GB of VRAM on the RTX 3070 Ti is probably to blame for my lack of superior performance as a gamer. With 24GB it would definitely raise my rating; I'd reach the top 10.

2. For Blender, going from a 3070 Ti to a 7900 XTX 24GB (oooooooohooooo!!!) just means a downgrade. The people at Puget Systems say so.
It's all good then!
 
It's all good then!
It's good that we understood each other. :slap:
I'm reposting this screenshot, don't forget about it. I added new games; the video cards are two years old, and 16GB didn't make a difference. It won't in 2024 or 2025 either, maybe in 2030. By then, this memory surplus will help the old GPU render with a 10% boost, a jump from 10 to 11 FPS. If you don't believe it, run the new games in extreme detail on a Radeon VII 16GB. Even in 4K, because it has enough memory. :D

[Attached screenshots: game benchmark results, June 2021 (no RT) and April 12th 2023 (no RT)]
 
This is a low-quality post. lololol. $450 for a 60-series card. Nice.
 
It's good that we understood each other. :slap:
I'm reposting this screenshot, don't forget about it. I added new games; the video cards are two years old, and 16GB didn't make a difference. It won't in 2024 or 2025 either, maybe in 2030. By then, this memory surplus will help the old GPU render with a 10% boost, a jump from 10 to 11 FPS. If you don't believe it, run the new games in extreme detail on a Radeon VII 16GB. Even in 4K, because it has enough memory. :D

[Attached benchmark screenshots]
LOL! VRAM amount doesn't matter until you run out of it; then you literally can't play the game, or you get insane stutters! We already have 5 games that use more than 8GB of VRAM, realistically heading towards 12GB, and it's only early 2023; most upcoming games are going to use 10GB or more.

8GB is an ENTRY-level amount, 12GB is the bare minimum for midrange, with 16GB realistically being the target for midrange and upper midrange.
 
This is a low-quality post. lololol. $450 for a 60-series card. Nice.
6650XT launched at $400 (and couldn't be had at that price) and it wasn't faster than a 3060Ti.
 
LOL! VRAM amount doesn't matter until you run out of it; then you literally can't play the game, or you get insane stutters! We already have 5 games that use more than 8GB of VRAM, realistically heading towards 12GB, and it's only early 2023; most upcoming games are going to use 10GB or more.

8GB is an ENTRY-level amount, 12GB is the bare minimum for midrange, with 16GB realistically being the target for midrange and upper midrange.

The reason for the stutters is that when you run out of VRAM, the GPU starts using system RAM as well, which is much, much slower than VRAM.
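If you want to see how close your own games get to that ceiling, here's a minimal sketch (assuming the nvidia-ml-py / pynvml bindings are installed and the card of interest is GPU index 0) that just polls the reported VRAM usage while a game is running:

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; change the index if needed

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are reported in bytes
        print(f"VRAM in use: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Keep in mind this reports total allocation across all processes (the same figure nvidia-smi shows), not strictly what the game needs, so treat it as a ceiling rather than a hard requirement.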
 
8GB is ENTRY level amount, 12GB is the bare minimum for mid range with 16GB being realistically the target for mid range and upper mid range.
Because when you play the latest and greatest at high res, ultra quality, you totally buy a mid-range video card, right? :wtf:
 
Considering the prices of high-end GPUs, it seems completely reasonable to aim for the mid-range.
Without lowering your expectations? Is that what passes as "reasonable" these days?
 
Without lowering your expectations? Is that what passes as "reasonable" these days?
At release, a midrange GPU was definitely sufficient to play everything, and I used my 1080 a year post-release to max out everything at 1080p at 60-100 FPS or better.

So yes, a midrange GPU today should definitely run everything at pretty much very high or ultra at 1080p, and with very minor concessions at 1440p.

That is definitely not unreasonable; it's the norm. We don't expect 4K maxed in 2023... but 1440p pretty much maxed? Yep. It's even Nvidia's punchline for the 4070.
 
At release, a midrange GPU was definitely sufficient to play everything, and I used my 1080 a year post-release to max out everything at 1080p at 60-100 FPS or better.

So yes, a midrange GPU today should definitely run everything at pretty much very high or ultra at 1080p, and with very minor concessions at 1440p.

That is definitely not unreasonable; it's the norm. We don't expect 4K maxed in 2023... but 1440p pretty much maxed? Yep. It's even Nvidia's punchline for the 4070.
4070? We're discussing 4060Ti here.

As for expectations, titles from id traditionally could not be maxed out at launch on anything; the hardware wasn't built yet. That happened with at least Quake, Doom 3 and Rage.
I do get what you're saying. I always bought mid-rangers and I could play everything on them. At the same time, I had to lower quality from time to time, but they were powerful enough that I could always find a combination of settings to lower that didn't make a distinguishable visual difference.
 
4070? We're discussing 4060Ti here.

As for expectations, titles from id traditionally could not be maxed out at launch on anything; the hardware wasn't built yet. That happened with at least Quake, Doom 3 and Rage.
I do get what you're saying. I always bought mid-rangers and I could play everything on them. At the same time, I had to lower quality from time to time, but they were powerful enough that I could always find a combination of settings to lower that didn't make a distinguishable visual difference.
Those titles, just like Crysis, are exceptions. Most other games have usually been playable at max settings with midrange GPUs like the 760, 960, 1060, or their AMD equivalents.
 
Those titles, just like Crysis, are exceptions. Most other games have usually been playable at max settings with midrange GPUs like the 760, 960, 1060, or their AMD equivalents.
I just told you that I always bought into the midrange and had to lower settings more often than not. But OK.
 
Still more than I paid for my RX6800
 
Let's reframe the cost here.

If you buy a $1,000 card every 2 years, that works out to about $42 a month. Does that seem like an astronomical, must-live-in-your-parents'-basement kind of cost to you?
$42 for the GPU, $21 for the rest of the system ($1,000 every 4 years), $10-20 for electricity = $73-83 a month :laugh:

Of course, it also depends on the country where you live.
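For what it's worth, the arithmetic behind those figures holds up; here's a quick sketch using only the numbers assumed in the two posts above (not measured data):

```python
# Monthly-cost arithmetic from the assumptions in the posts above.
gpu_per_month = 1000 / 24       # $1,000 GPU replaced every 2 years   -> ~$41.67
rig_per_month = 1000 / 48       # $1,000 rest of system every 4 years -> ~$20.83
power_low, power_high = 10, 20  # rough electricity estimate, varies by country

low = gpu_per_month + rig_per_month + power_low
high = gpu_per_month + rig_per_month + power_high
print(f"${low:.2f} to ${high:.2f} per month")  # $72.50 to $82.50, i.e. the ~$73-83 above
```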
 
4070? We're discussing 4060Ti here.

As for expectations, titles from id traditionally could not be maxed out at launch on anything; the hardware wasn't built yet. That happened with at least Quake, Doom 3 and Rage.
I do get what you're saying. I always bought mid-rangers and I could play everything on them. At the same time, I had to lower quality from time to time, but they were powerful enough that I could always find a combination of settings to lower that didn't make a distinguishable visual difference.
Exceptions, right. I mean, right now what you get is a basic, off-the-shelf engine that will push features that are going to destroy these cards. Consoles will carry that push.

As for discussing the 4060 Ti - yes, so you won't expect to max out 1440p; you will tweak a little more. But even that won't carry you, because 8GB simply won't suffice, and even 1080p might turn out to be problematic pretty soon. I was already seeing lots of instances of 7+ GB in use on my GTX 1080. Now that I have more VRAM, I see it run way over more often than not - even 13GB isn't an exception. The gap's getting pretty large pretty quickly.

This 4060 Ti might turn out to be the 3GB 1060 versus the 6GB one, where the latter can simply run more games properly, regardless of what settings you have to turn down.

Here's Cyberpunk at 3440x1440 on max settings, no RT, with FSR 2 Quality: 100+ FPS virtually everywhere, and just over 8GB allocated; it creeps up if you go outside, and 8.8GB happens. I think that clearly shows VRAM will be the limiting factor here on x60s, because you've definitely got the core oomph to run at these settings at 40-50 FPS at least, and you can even add a sprinkle of RT on top if you're happy running FSR Balanced (or its DLSS equivalent, which produces equal or better FPS). And sure - even at 8.3GB allocated you can still run the game fine on an 8GB card. But this is a 2021 title.

[Screenshot: Cyberpunk 2077 at 3440x1440, max settings, no RT, FSR 2 Quality]


Here's RT on / Psycho, FSR Quality - 9.4GB; still 55 FPS, and this is on AMD; we know Nvidia delivers better RT frame rates, especially in Cyberpunk with multiple effects.

Also... lol. Why would I even bother using this for that FPS hit? :D I've just been playing this game, and even Path Tracing (which runs at a whopping 14 FPS here :D) looks almost identical; I have to crawl into the screen to appreciate the differences, and in many cases I preferred the raster image for its overall presentation and lighting balance. Looking at the sun - and it's low in the sky a LOT of the time - is ridiculous with RT on. And there is no sunglasses mode.

But yeah... sacrificing image quality because you lack VRAM... seems like a total waste of a GPU to me. You can run the game at playable frame rates and virtually maxed out if you have sufficient VRAM.

[Screenshot: Cyberpunk 2077 with RT Psycho, FSR Quality]
 
Because when you play the latest and greatest at high res, ultra quality, you totally buy a mid-range video card, right? :wtf:
Yes! Midrange has historically been capable of running the latest games at the highest settings and at a relatively good resolution. I mean, the GTX 970, as I remember, used to run almost all games at 60+ FPS at 1080p and most games at 1440p as well. The GTX 1070 was perfectly capable of running all of the latest games at 1440p and 60+ FPS, same with the RTX 2070; never mind that the 2070 was a step in the wrong direction and bad value, it was still capable of running games at 1440p at 60+ FPS.

All of these cards have been able to run most AAA games, at least 3 years after release, at either the highest settings or one tier below at 60+ FPS.

With AMD it's been even better, as they've always provided a lot more VRAM and room to grow; we saw it with the RX 500, RX 5000, etc., where those series kept getting better and better.

The most direct example of a bad-value, DOA card is probably the 1060 3GB: just a year after release, that card could not run a third of games at the highest texture settings, and most 4GB cards got obliterated then as well. I remember I had just bought the GTX 1060 6GB when ROTTR was out, and that game used up to 6.5GB of VRAM. Assassin's Creed something used 6+ GB, so even the 1060 6GB was on the edge just 2 years later!

This 8GB is the basic level, the lowest entry point; anything less and games are going to stutter, crash, have high frame times, etc.

For GPUs that haven't even come out yet and are still a month or two from release, having only 8GB on so-called mid-tier cards is blatantly stupid and fraudulent.
 