Wednesday, January 1st 2014
Why the Litecoin Craze Hurts More Than Helps Brand AMD Radeon
Price wars between GPU makers are something to look forward to each year, as that's typically when you get the best bang for your buck. The optimal time to buy a new graphics card usually comes after both AMD and NVIDIA have launched their new lineups. AMD tends to launch first, followed by NVIDIA, which kicks off a cycle of proactive and reactive price cuts between the two. That cycle churns the $300 price-performance sweet spot so well that a purchase from that segment usually sets your PC up for the following three years. 2013-14 saw a major disruption to this cycle: Litecoin mining. Litecoin miners will hurt brand AMD Radeon more than help it, and here's why.

Litecoin miners are buying up AMD Radeon GPUs. The reason is that GPUs based on the Graphics CoreNext architecture are exceptionally good at processing P2P currency hash workloads, which are just about the only way anyone with a computer can create real money by putting their CPU or GPU to work. CPUs simply won't cut it with such workloads anymore, despite being multi-core and equipped with AES-NI instruction sets.
GPUs inherently handle parallelized workloads better, and it so happens that investing in AMD Radeon GPUs is profitable for Litecoin miners: deployed at sufficient scale, the cards pay for themselves and go on to generate revenue. Graphics CoreNext is the only GPU architecture that restores the competitiveness of GPUs against ASICs, chips purpose-built to handle peer-to-peer currency validation hash processing.
Litecoin, like Bitcoin, is a peer-to-peer currency. It's decentralized, in that there's no central bank or monetary authority behind it. The U.S. Federal Reserve governs the U.S. Dollar, for example; a trio of HSBC, Standard Chartered, and Bank of China issues the Hong Kong Dollar; the European Central Bank issues the Euro; and most countries have constitutional bodies set up to manage their currencies. Every transaction using a currency, down to buying soda from a vending machine with loose change, is ultimately vouched for by its central bank. Litecoin has none of that, so it relies on a distributed computing network to validate each transaction, and the integrity of each unit of the currency within the wallet it belongs to.
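The validation work the network pays for boils down to a brute-force hash search: vary a nonce until a block's hash falls below a difficulty target. Here's a minimal toy sketch of that loop in Python; it uses plain SHA-256 and a fixed leading-zero prefix for simplicity, whereas Litecoin actually uses the memory-hard scrypt function and a dynamically adjusted target.

```python
import hashlib

def mine(block_data: bytes, difficulty_prefix: str = "0000") -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 hash starts with a prefix.

    This illustrates the search loop only. Real Litecoin mining hashes a real
    block header with scrypt, and difficulty is a numeric target, not a prefix.
    """
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1

nonce = mine(b"example transaction batch", "0000")
print(nonce)  # first nonce producing a hash with four leading zero hex digits
```

Because each nonce can be tried independently, thousands of GPU shader cores can search in parallel, which is why hash rate scales so well on graphics hardware.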
Unlike older distributed computing ventures such as Folding@Home and WCG, which continue to draw people to donate CPU/GPU time to charitable causes like processing scientific problems, the Bitcoin and Litecoin networks pay the people who participate in them. They're paid in, well, Bitcoins and Litecoins, respectively. The beauty of it all? Not only can you pay for some goods and services with these currencies, you can also exchange them for your everyday currency. They convert to U.S. Dollars, and you can probably convert a U.S. Dollar to any other currency on the planet.
The faster you process P2P currency validation loads, the more load is directed toward you, and the more you earn. Performance per dollar immediately becomes king here. Litecoin.info compiled an exhaustive list of AMD and NVIDIA GPUs, sorted by their Litecoin hash processing performance. You'll note how, at reference clock speeds, NVIDIA's GeForce GTX Titan crunches just 320 kH/s (kilo-hashes per second), while a Radeon R9 290X, at reference base-clock speeds, yields over three times that, at 980 kH/s. The GeForce GTX 780 Ti in the comparison yields 430 kH/s, but its clock speeds are not mentioned, so you can't take its numbers at face value. Even if we assume for a moment that the $650 GTX 780 Ti is running at reference speeds, you can still work out a huge price-performance gap between it and the $550 R9 290X. This, we believe, has led some North American retailers to get opportunistic, inflating the retail price of the R9 290X to close to $700, and the R9 290 (non-X) to close to $500, up from the $399 MSRP it launched at.
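To make the gap concrete, here's a quick back-of-the-envelope calculation of mining performance per dollar, using the hash rates and prices cited above (the 780 Ti's clocks are unverified, per the article, so treat its figure as optimistic):

```python
# kH/s per dollar, from the hash rates and street prices cited in the article.
cards = {
    "GeForce GTX 780 Ti": (430, 650),  # (kH/s, USD); clock speeds unverified
    "Radeon R9 290X":     (980, 550),  # reference base clock, launch price
}
for name, (khs, price) in cards.items():
    print(f"{name}: {khs / price:.2f} kH/s per dollar")
# GeForce GTX 780 Ti: 0.66 kH/s per dollar
# Radeon R9 290X: 1.78 kH/s per dollar
```

Even at the inflated $700 street price, the R9 290X still works out to about 1.40 kH/s per dollar, more than double the 780 Ti's ratio, which explains why miners keep buying despite the hikes.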
These hikes in prices of AMD Radeon R9 series products needn't be in reaction to a hike in demand, and retailers have the luxury of assuming that anyone buying a Graphics CoreNext-based GPU is doing so for Litecoin/Bitcoin. And so we find the argument that Litecoin mining has caused a shortage in AMD Radeon inventories, which is responsible for the price hike, absolute horse poo. More so because AMD's chips don't just destroy NVIDIA's, they also hold their own against some purpose-built ASIC boards on the cost-performance metric.
Yet another reason we believe the hike in AMD Radeon prices is not a result of an inventory shortage is pricing in other markets. Retailers in the EU apparently have healthy inventories of AMD Radeon, or at least the unaffected prices of graphics cards there seem to suggest that, if we were to believe the short-supply argument. It's not that Europeans are any less enamored of P2P currency, or of the art of making money; rather, European retailers aren't getting cocky about pricing their AMD Radeon inventory to end users, at least not yet. Eventually, bad pricing of AMD Radeon may catch up with Europe too.
That brings us to the operational portion of this op-ed: how the P2P currency craze hurts AMD more than it helps. AMD isn't manufacturing "Hawaii" silicon on a war footing. There are no indications that the company is scaling up supply to meet the "demand." The inflation in AMD Radeon prices appears to be backed more by the chips' huge price-performance advantage over NVIDIA at P2P currency processing than by short supply. Whoever is into Litecoin processing is apparently willing to cough up $700 for an R9 290X.
What this does is make AMD Radeon prohibitively expensive for its actual target market: gamers and PC enthusiasts. Price-performance gaps between AMD and NVIDIA are tight when it comes to what Radeon and GeForce are actually designed for, rendering 3D graphics for games. Fewer gamers and enthusiasts will buy AMD Radeon from here on out, as a result of the Litecoin craze. In the worst-case scenario, this could give NVIDIA the opportunity to arbitrarily raise prices of GeForce GTX products slightly, while still maintaining higher price-performance at gaming, if not at P2P currency processing.
Here's what AMD could try to wriggle itself out of this mess: fight fire with fire, and build low-cost, limited-edition AMD Stream GPGPU boards based on the Graphics CoreNext architecture, which offer higher price-performance ratios (at P2P currency processing) than even its own Radeon R9 series graphics cards. Those AMD Stream boards could be based on chips purpose-built for P2P currency processing loads, carry only as much memory as the workload needs, and lack unnecessary components such as display outputs, so they could be sold at relatively low prices and beat the Radeon R9 series at price-performance. Again, there are two price-performance ratios at play here, one at P2P currency processing and one at gaming, and in the current scenario the former is holding the latter hostage. If AMD succeeds in making the Radeon R9 series unappealing to the P2P currency community, it will restore the brand to its original target audience: gamers.
AMD spent the better part of the past five years building impressive relations with game developers, who developed and optimized their games for AMD Radeon. The company risks harming those efforts if it gives in to the Litecoin craze. It may turn some profit by catering to the craze with the Radeon R9 series in the short term, but those profits will inevitably come at the expense of brand Radeon in the long term. Time for some hard thinking.
131 Comments on Why the Litecoin Craze Hurts More Than Helps Brand AMD Radeon
On a side note... how do the FirePro GPUs perform for coin mining?
Non waffle, they do well
It's 2014 and some of these guys are still fighting the old "better drivers" and "PhysX" fanboy wars.
[INDENT]The scrypt function is designed to hinder such attempts by raising the resource demands of the algorithm. Specifically, the algorithm is designed to use a large amount of memory compared to other password-based KDFs, making the size and the cost of a hardware implementation much more expensive, and therefore limiting the amount of parallelism an attacker can use (for a given amount of financial resources).
From wili.cc/blog/gpgpu-faceoff.html
"Ambient Occlusion code is a very mixed load of floating point and integer arithmetics, dynamic branching, texture sampling and memory access. Despite the memory traffic, this is a very compute intensive kernel."
[/INDENT]
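The scrypt quote above is worth making concrete. Python's standard library exposes scrypt through `hashlib.scrypt`, whose cost parameters map directly onto the memory-hardness argument: raising `n` (CPU/memory cost) or `r` (block size) inflates the roughly `128 * n * r` bytes of working memory each hashing core needs, which is exactly what limits cheap ASIC parallelism. A minimal sketch, using Litecoin's published parameters (n=1024, r=1, p=1, about 128 KiB per hash):

```python
import hashlib

# scrypt cost parameters: n (CPU/memory cost), r (block size), p (parallelism).
# Working memory is roughly 128 * n * r bytes per hash, so raising n or r
# raises the RAM an ASIC needs per core -- the property the quote describes.
# Litecoin uses n=1024, r=1, p=1 (~128 KiB per hash). The header bytes and
# empty salt here are placeholders, not real Litecoin block data.
digest = hashlib.scrypt(b"block header bytes", salt=b"", n=1024, r=1, p=1, dklen=32)
print(digest.hex())  # 32-byte (256-bit) scrypt hash, as Litecoin uses
```

Note that 128 KiB is still small enough for a GPU's on-board memory to hold many instances at once, which is why GCN cards mine Litecoin so well even though scrypt blunts ASICs.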
I know it hurts you to remember, but frame latency was a driver problem. Not only that, it wasn't a problem AMD figured out either rapidly or on its own. It was, however, a widely pervasive, long-term problem they had for years before even acknowledging it.
That's to say nothing of the lack of Crossfire support at the 7970 launch, or the impact frame latency had on the usability of Crossfire for all of 2012 and most of 2013. It took AMD from January 2013 until August 2013 to even release a pre-release fix for frame latency, and they still haven't finished a fix for anyone running resolutions higher than 1600p who isn't buying GCN 1.1-based cards (R9 290/290X, 7790-derived cards), despite spending years compelling users to "upgrade" to Eyefinity.
I'd say driver issues are still topic of the day until AMD starts having more frequent YEARS of not having massive driver issues.
I do have a bitter taste in my mouth since Eyefinity hasn't worked right since day one, IMHO, and I've owned Crossfire and Eyefinity in all their incarnations since launch. It'd be great if it worked perfectly, but it doesn't.
And it is very much a small market, but at the same time, you can't blame me for wanting to get what AMD advertised but didn't deliver, either. 7970 Crossfire Eyefinity is still very much broken, and is why I bought new cards.
I just wonder what will happen when the mining craze slows down, and all these AMD cards get dumped on the used market, barely working since they've been pushed truly hard.
Comment #1 - Why are the prices and stock levels in Europe not affected? As someone pointed out, the cost of VAT and electricity lowers the payback, while exchanging coins to USD and then exchanging that to euros probably kills any huge return. That's not to say folk in Europe aren't getting cards to do it; they probably bought second-party from the US to get their cards.
#2 - Why doesn't AMD ramp up production, as folk say? It takes weeks or months to schedule new wafer starts, and that assumes TSMC has openings in production. At best, if AMD upped starts at TSMC in mid-December, it might have boxed product on the way into the channel around March/April. Then what happens if AMD adds production only to find that, long before boxed units reach the channel, Litecoin mining has gone bust? AMD would be giving OEMs incentives (kickbacks) to help move excess product, on production it would surely have paid a premium to get TSMC to schedule. There's not much AMD can do.
One upside may well be the NVIDIA gamer who thought, "I'll buy AMD to mine and then also game!" Face it: if you can buy a high-end gaming card, make money with it, and even sell it for close to MSRP, what's not to love? Then what if their experience with AMD dispels all the negativity they'd assumed about AMD... AMD might win customers! Nice that you can probably sell such 7970s for a pretty penny, as they're desirable for mining... more than the perhaps $200 each they might have garnered if mining weren't around.
I didn't really think most hardware "degrades" from being "pushed hard", as long as the environment/cooling has been sufficient.
Well......
Gameplay was choppy, people definitely complained, but many played on (it's not like the bad pacing caused black screens, is it?), and some red fanboys even claimed not to notice. It's come to light and is being fixed. Anything else? I mean, it's not like the frame latency was caused by faulty cards and led to a class-action settlement, is it?
Many games have had frame latency issues, or other problems, since day one, and switching between single-GPU and Crossfire doesn't work right without rebooting; those issues present a large problem for me. I have 429 games in Steam alone... and maybe 10% of them work right in Eyefinity and Crossfire.
So, my PSU blew up and I need to buy a new one. Might as well buy new GPUs at the same time... but only 10% of my games work right with my 7970s... might as well give NVIDIA a shot and deal with other issues.
Your last part.. Damn good question there.. some unhappy people when they crap out sooner rather than later. I know that will be my question when I see them come up FS...
I use RadeonPro to mitigate the microstutter issues
www.radeonpro.info/2013/03/crossfire-microstutter-afterburner-vs-radeon-pro/
www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-6.html
The GeForce GTX 690 has its own microstutter issues.
I did many reviews posted here with GTX 670 SLI. I know how each performs. I had triple 7950s, too.