Friday, June 22nd 2018
NVIDIA's Next Gen GPU Launch Held Back to Drain Excess, Costly Built-up Inventory?
We've previously touched upon whether NVIDIA should launch its 1100 or 2000 series of graphics cards ahead of any new product from AMD. At the time, I wrote that I saw only benefits to that approach: earlier time to market -> satisfaction of upgrade itches and entrenchment as the only latest-gen manufacturer -> higher prices in the absence of competition -> the ability to respond by lowering prices after amassing a war chest of profits. However, reports of a costly NVIDIA mistake in overestimating demand for its Pascal GPUs add some other shades to the whole equation.
Inventory write-offs are costly (just ask Microsoft), and NVIDIA has apparently found itself in a miscalculating position: overestimating gamers' and miners' demand for its graphics cards. When it comes to gamers, NVIDIA's Pascal graphics cards have been available in the market for two years now - it's relatively safe to say that the majority of gamers who needed higher-performance graphics cards have already taken the plunge. As for miners, the cryptocurrency market contraction (among other factors) has led to a tapering-off of graphics card demand for that particular workload. The result? NVIDIA's demand overestimation has led, according to Seeking Alpha, to a "top three" Taiwan OEM returning 300,000 GPUs to NVIDIA, and to "aggressively" increased GDDR5 buying orders from the company, suggesting an excess stock of GPUs that need to be made into boards.

With no competition on the horizon from AMD, it makes sense that NVIDIA would give the market time to absorb its excess graphics cards. Price cuts would be a good solution for excess inventory, but the absence of competition removes that incentive: NVIDIA's products are selling well against AMD's current lineup, and as such, there is no need to artificially increase demand - and lower ASPs in the process. Should some sort of pressure materialize, NVIDIA can lower MSRPs at a snap of its proverbial fingers.

Of course, this raises the question of what exactly NVIDIA will do with its R&D on graphics product generations that keep slipping further into the future. Volta never saw the light of day in consumer graphics card products, and we're already talking about the launch of a Turing or Ampere architecture from the company - hopefully in Q3 of this year. That is R&D investment losing its impact and its chance to generate the revenue expected at its inception.
Sure, revenue keeps coming in from older-generation hardware - but these delays give the competition a chance to leapfrog, in performance and technology, the interim NVIDIA architectures that haven't reached the market, setting their sights on future releases instead. We're left with an NVIDIA that has only partially capitalized on its Volta R&D - in the pro and server segments, for example - and funds that could have been better spent elsewhere. But opportunity cost is part of this business, right?
Sources:
Seeking Alpha, via TechSpot
70 Comments on NVIDIA's Next Gen GPU Launch Held Back to Drain Excess, Costly Built-up Inventory?
As to the GP104, those are only something Alienware, Asus ROG, or boutique builders could move, and those don't have near the volume.
So now there's no demand when there was so much pent up demand?
So now we are suddenly over supplied when we were under supplied just last month?
Why would the vendors just not sell the cards? Makes no sense.
The 1180 is coming out at end of July so...
I took it as just "one" of the top 3 vendors sent back 300K...
We haven't heard if others are trying to juggle their inventories. This would indicate that particular AIB couldn't A) get enough memory to build them into cards; B) felt they wouldn't burn through them fast enough without discounting, which was something Nvidia couldn't acquiesce to; or C) some mix in the middle.
Nvidia figures they can sell them fairly quickly, as they are better positioned to 'strong-arm' enough GDDR fast and move them to other AIBs. Sure, they could've helped with rebates, but they probably didn't want to upset other AIBs who have parts that wouldn't move if 'X' company is getting helped. So buying them back, securing memory, and then spreading them across all AIBs who would like to partake makes it more fair. And Nvidia can manage allotment and better control the pricing; even if it comes to rebates, they are equally dispersed to those that participate.
Maybe, or they are saying they believe so strongly they are shorting it themselves.
But a valid point to bring up nonetheless.
SA seems to also be giving multiple viewpoints on this topic.
seekingalpha.com/article/4182761-another-reason-hedge-nvidia
They have also called out those shorting companies like Unifi in the past.
I won't refuse
We are not rich enough to be buying the cards the miners have left for us, at over MSRP.
I'm so happy that we have nVidia's "love" to keep us warm while we wait.
If an OEM returns 300,000 graphics cards, it's because they no longer sell machines with these cards, which means they are older cards (Maxwell and Kepler).
And contrary to what is implied in the article, this has nothing to do with mining.
I'm sure it's a fine card, but damn turn up on time.
Not a single leak from any of the chip makers or OEM card makers - these are companies with many hundreds of employees; they aren't the KGB or CIA.
The entire rumour market is based on people's fantasies. When a new product is forthcoming, we will first see it in an Nvidia roadmap.
Jensen is a human headline, can't help himself when he has a new product forthcoming.
The next-gen card will be April-June 2019. Nvidia has slowed development - they have zero competition - and is waiting for some momentum from AMD or Intel.
The next sign of a new card will be big discounts on last-gen cards: the GTX 1080 dropping to $499 USD, then to $450 USD.
These price reductions will occur pre-Christmas 2018 at $499 USD to allow sellout of old stock during the Christmas binge, then in February card prices will make the final fall to $450 USD, when consumers will know new cards are coming.
But expecting the 1160 to equal the 1070/1080 means it has to go down to $299 before it makes sense..
How does this really have ANYTHING to do with NVIDIA? I mean, other than that they have to make a refund and have the GPUs sitting in a warehouse. Also, from what is said, it's bare GPUs and not complete cards, or the memory bit would not have been mentioned, as it's not relevant...