Raevenlord
News Editor
We've previously touched upon whether NVIDIA should launch its 1100 or 2000 series of graphics cards ahead of any new product from AMD. At the time, I wrote that I only saw benefits to that approach: earlier time to market -> satisfaction of upgrade itches and entrenchment as the only latest-gen manufacturer -> higher prices owing to the lack of competition -> the ability to respond by lowering prices after amassing a war-chest of profits. However, reports of a costly NVIDIA mistake in overestimating demand for its Pascal GPUs do lend some other shades to the whole equation.
Write-offs in inventory are costly (just ask Microsoft), and apparently, NVIDIA has found itself on the wrong side of a miscalculation: overestimating gamers' and miners' demand for its graphics cards. When it comes to gamers, NVIDIA's Pascal graphics cards have been available on the market for two years now - it's relatively safe to say that the majority of gamers who needed higher-performance graphics cards have already taken the plunge. As for miners, the cryptocurrency market contraction (among other factors) has led to a tapering-off of graphics card demand for that particular workload. The result? According to Seeking Alpha, NVIDIA's demand overestimation has led to a "top three" Taiwan OEM returning 300,000 GPUs to NVIDIA, and to "aggressively" increased GDDR5 orders from the company, suggesting an excess stock of GPUs that still needs to be turned into boards.
With no competition on the horizon from AMD, it makes sense that NVIDIA would give the market time to absorb its excess graphics cards. The usual solution for excess inventory would be price cuts, but the absence of competition removes that incentive: NVIDIA's solutions are selling well against AMD's current products, so there is no need to artificially stimulate demand - and lower ASPs in the process. Should some sort of pressure materialize, NVIDIA can lower MSRPs with a snap of its proverbial fingers.
Of course, this raises the question of what exactly NVIDIA will do with its R&D on graphics product generations that keep slipping further into the future. Volta never saw the light of day in consumer graphics cards, and we're already talking about the launch of a Turing or Ampere architecture from the company - with hopes that it will be released in Q3 of this year. That is R&D investment that loses impact, and the chance to generate the revenue expected at its inception, with every delay. Sure, revenue keeps coming in from older-generation hardware - but these delays give the competition a chance to leapfrog, in performance and technology, the interim NVIDIA architectures that haven't reached the market, by setting its sights on future releases. We're left with an NVIDIA that has only partially capitalized on its Volta R&D - in the professional and server segments, for example - and funds that could arguably have been better spent elsewhere. But opportunity cost is part of this business, right?
View at TechPowerUp Main Site