
NVIDIA GeForce RTX 4060 Ti Sees Price Cuts in Europe Just Hours After Launch

Does the 4060 Ti max out the AD106 chip? If so, I'm not sure how a Super SKU could be introduced unless the host bus, bus width, and memory configuration are changed.
Since there were no Super SKUs in the 30 series, my assumption is that those were a one-time run. Just my opinion, but I could be wrong. I am curious: what makes you think otherwise, other than yields?
Just the tiny fact that everyone seems to hate all the GPUs below the 4080.
A refresh could mend this issue, assuming Nvidia can do something about the memory support.
 
Just the tiny fact that everyone seems to hate all the GPUs below the 4080.
A refresh could mend this issue, assuming Nvidia can do something about the memory support.
I think lowering the price for each tier product would be more beneficial than releasing a refresh (e.g. Super) to regain consumer confidence.
 
I think lowering the price for each tier product would be more beneficial than releasing a refresh (e.g. Super) to regain consumer confidence.
While on an emotional basis I agree with this... commercially that's one royal clusterfk.

- Every next launch MSRP is going to be met with big questions.
- Nvidia's position as a salesman gets obliterated (dropping everything even half a tier in pricing this shortly after launch makes them the market dealer you have to haggle with - you just know the price can be halved, whatever it is).
- There is no reason on the competitive front - AMD's whole lineup would suddenly be pointless. You'd be selling by undercutting, not because the product is worthwhile on its own, and that's an opening shot for an all-out price war. Nobody knows where the bottom lies, but Nvidia does know its operating costs are higher than AMD's, as is its initial investment, even if it can put that to use in a bigger market. We can take an educated guess that Nvidia's GPUs are also more expensive to make overall, once you include the software development behind them, the monolithic die, etc.

This, among many other reasons I probably don't know of, is why you will always see them refresh a lineup before doing what you suggest. This mechanic also highlights why and how AMD and Nvidia trail each other so closely - neither benefits from going way out of line. Some semblance of balance preserves both their product value. It's not a cartel... but it damn well feels like one.
 
I think lowering the price for each tier product would be more beneficial than releasing a refresh (e.g. Super) to regain consumer confidence.
Not to Nvidia...

Then again, their profits from gaming products are down 38% YoY...
 
Not to Nvidia...

Then again, their profits from gaming products are down 38% YoY...

Unfortunately, I don't think Nvidia sees the lower profits as an effect of high prices. I think it was the plan all along to lower production because demand is down. Given the outlook on the economy and potential personal finance issues, consumers just aren't spending money on new GPUs when they can make do for now with what they already have.
 
Leather Jacket went to the markets for some grubs
 
20 euros? :mad:

I opened up this thread in sheer excitement.... but.... 20 euros?
 
Nvidia doesn't care as much about gaming anymore. They are heavily invested in the AI field, which is currently experiencing a boom, and this focus is expected to continue for at least a couple of years. Despite losing ground in the gaming industry, their shares are still on the rise.
 
It's €440+ here. Plenty of 3060s for about €400. 4050s at the same price point. Tons of 3060 Tis and also some 3060s at €500; I assume the stores got them at the height of the shortage and won't/can't sell them at a loss.
 
Tons of 3060 Tis and also some 3060s at €500; I assume the stores got them at the height of the shortage and won't/can't sell them at a loss.
Hopefully, it's the same stores that canceled orders in late 2020 and early 2021, to sell these cards for 2-3 times the price to miners when the stock finally appeared.
 
The most anyone should pay for an 8GB entry-level GPU is about $300. The 16GB variant should launch at $399. It still wouldn't exactly be a steal at $399 because its performance is mediocre, but at least it wouldn't be out of VRAM already.
 
I am not paying more than 200-250€ for a 1080p entry-level GPU.
When did everybody start talking about the **60 GPUs as entry-level? Nonsense. What about the xx50/xx30/xx10 GPUs?
 
What about the xx50/xx30/xx10 GPUs?
xx50, xx30, and xx10 haven't really been a thing for a long time. As far as I know, the 2000, 3000, and 4000 series have none of those cards?

The last base-model xx50 card that I remember actually being entry-level priced with decent performance was the 1050, which was like 7 or 8 years ago?
When did everybody start talking about the **60 GPUs as entry-level?
Probably the RTX 2000 era, when Nvidia phased those xx50/xx30/xx10 cards out.
 
When did everybody start talking about the **60 GPUs as entry-level? Nonsense. What about the xx50/xx30/xx10 GPUs?
The codename of a GPU means nothing to me. The 4060 is made for 1080p gaming (that resolution is budget entry level) on medium to high settings over the next 3 years (tops). xx60 cards used to be midrange, but now they are overpriced entry-level cards.

Cards below that power level are basically legacy GPUs made for retro gaming (7-10+ year old games). Why would anyone dish out 400-500€ (the price of a modern 2K-4K console) for a GPU that can solidly play maybe Skyrim or GTA 5? The only scenario I can think of is if an old GPU died, and even then I would look elsewhere, save a bit more cash, or buy a console (even a Switch can do that job).

1080p is the lowest commonly used gaming resolution - a pure example of entry level.
1440p is increasing its presence and is considered the sweet spot for gaming at this time (monitors are accessible, hardware is accessible; it's just Nvidia trying to overprice their GPUs, with AMD following suit).
4K is the top, besides 8K, which is not really a thing yet (those monitors and TVs are only just coming down in price, and it will take several years for them to be established en masse).

Just by this simple division of resolutions, it's clear that 1080p is entry level, 1440p is midrange (where most xx70 and some xx80 cards fall), and 4K is at the top.
 
The codename of a GPU means nothing to me. The 4060 is made for 1080p gaming (that resolution is budget entry level) on medium to high settings over the next 3 years (tops). xx60 cards used to be midrange, but now they are overpriced entry-level cards.
So true. If this were the 10 series, the 1080p budget card would be the GTX 1050, which has 18% of the flagship's cores for just 9% of its price:
  • Titan = 3584 cores, $1200
  • 1050 = 640 cores, $110
Let's compare to the 40 series: the 1080p budget card is (according to Nvidia) the 4060, which has about 19% of the 4090's cores for 19% of its price:
  • 4090 = 16384 cores, $1600
  • 4060 = 3072 cores, $300
Not only are we being charged twice as much for 1080p, it's an xx50-class card by both specification and relative performance.
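Those ratios are easy to sanity-check; a quick sketch, using only the core counts and launch MSRPs quoted in the post above, reproduces the percentages:

```python
# Sanity check of the cores-vs-price comparison above.
# Figures (CUDA core counts, launch MSRPs in USD) are the ones quoted in the post.
cards = {
    "GTX 1050 vs Titan (Pascal)": (640, 3584, 110, 1200),
    "RTX 4060 vs RTX 4090": (3072, 16384, 300, 1600),
}

for name, (cores, flagship_cores, price, flagship_price) in cards.items():
    core_pct = 100 * cores / flagship_cores      # share of the flagship's cores
    price_pct = 100 * price / flagship_price     # share of the flagship's price
    print(f"{name}: {core_pct:.0f}% of the cores for {price_pct:.0f}% of the price")
```

Running it gives roughly 18% of the cores for 9% of the price in the Pascal case, versus 19% for 19% in the Ada case, which is the doubling the post is pointing at.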
 
"but quickly saw its price trimmed by 20€, down to 419€, in what could be very early signs that the card isn't exactly flying off the shelves"

You can't realistically expect shops to lower prices after only a few hours based on sales performance.

More likely: they adjusted prices based on their competitors' prices to be more competitive.

The 4060 Ti FE is out of stock, btw.
What, the MSRP one? What a surprise.

Nvidia makes no price cuts; the OEMs have to.

Sounds painful for OEMs
 
While on an emotional basis I agree with this... commercially that's one royal clusterfk.

- Every next launch MSRP is going to be met with big questions.
- Nvidia's position as a salesman gets obliterated (dropping everything even half a tier in pricing this shortly after launch makes them the market dealer you have to haggle with - you just know the price can be halved, whatever it is).
- There is no reason on the competitive front - AMD's whole lineup would suddenly be pointless. You'd be selling by undercutting, not because the product is worthwhile on its own, and that's an opening shot for an all-out price war. Nobody knows where the bottom lies, but Nvidia does know its operating costs are higher than AMD's, as is its initial investment, even if it can put that to use in a bigger market. We can take an educated guess that Nvidia's GPUs are also more expensive to make overall, once you include the software development behind them, the monolithic die, etc.
I agree with the points mentioned; it makes sense why a price cut at this stage wouldn't happen. Nvidia has to recoup their R&D costs. Thanks for the response.
This, among many other reasons I probably don't know of, is why you will always see them refresh a lineup before doing what you suggest. This mechanic also highlights why and how AMD and Nvidia trail each other so closely - neither benefits from going way out of line. Some semblance of balance preserves both their product value. It's not a cartel... but it damn well feels like one.
Refresh their lineup? Are you referring to an interim refresh like a Super series, or a whole new lineup (e.g. 4000 > 5000 series), when a price cut would occur? Nvidia has recently done one interim refresh, with the 2000 Super series. If I recall, 2000-series cards did receive a price cut when the Super series was released. I just don't know if Nvidia will release one, as there are many factors involved. I agree we just see a leapfrog in performance between the two GPU makers. I am sure you remember Sandy Bridge from Intel over a decade ago. That was the last big jump in CPU performance that I remember. I had an Ivy Bridge, which covered my needs for close to 10 years. I upgraded to Ryzen when my Z77 board died.
Let's wait and see.
 
I am sure you remember Sandy Bridge from Intel over a decade ago. That was the last big jump in CPU performance that I remember.
Alder Lake and all iterations of Zen since Zen 2 have been at least as impressive as Sandy Bridge in surpassing their predecessors.
 
Alder Lake and all iterations of Zen since Zen 2 have been at least as impressive as Sandy Bridge in surpassing their predecessors.
I would like to see how well Alder Lake and my Zen 3 fare over the years to come; hopefully both have similar longevity. My 3770K was more than enough for my needs, which is why I was impressed by how long it stayed viable. My previous build before the 3770K was a Core 2 Quad.
 
I agree with the points mentioned; it makes sense why a price cut at this stage wouldn't happen. Nvidia has to recoup their R&D costs. Thanks for the response.

Refresh their lineup? Are you referring to an interim refresh like a Super series, or a whole new lineup (e.g. 4000 > 5000 series), when a price cut would occur? Nvidia has recently done one interim refresh, with the 2000 Super series. If I recall, 2000-series cards did receive a price cut when the Super series was released. I just don't know if Nvidia will release one, as there are many factors involved. I agree we just see a leapfrog in performance between the two GPU makers. I am sure you remember Sandy Bridge from Intel over a decade ago. That was the last big jump in CPU performance that I remember. I had an Ivy Bridge, which covered my needs for close to 10 years. I upgraded to Ryzen when my Z77 board died.
Let's wait and see.
Nvidia has refreshed more often. Pascal was refreshed with 11 Gbps cards, for example, and that was also the trigger to drop pricing on the 'slower' 1080. They also released the GTX 16 series alongside Turing, which was obviously influenced by the price and die size of the 2000 cards, even though the 2060 was a pretty decent x60 for its money. Ampere gained double-VRAM versions of numerous cards. Ada is going to get a 16GB 4060 Ti. I'd say all of these are same-gen refreshes of one kind or another - they're all slight adjustments to cater to the market better with essentially the same product stack.

Yep... ran Ivy too :) Those were the days. That quad core is still working in my HTPC, lol.
 
xx50, xx30, and xx10 haven't really been a thing for a long time. As far as I know, the 2000, 3000, and 4000 series have none of those cards?

The last base-model xx50 card that I remember actually being entry-level priced with decent performance was the 1050, which was like 7 or 8 years ago?

Probably the RTX 2000 era, when Nvidia phased those xx50/xx30/xx10 cards out.
Uhhh, did you forget the GTX 1650, GTX 1650 Ti, GTX 1650 Super, and RTX 3050? All superseded the GTX 1050 and 1050 Ti.
 