Wednesday, January 1st 2014

Why the Litecoin Craze Hurts More Than Helps Brand AMD Radeon

Price wars between GPU makers are something to look forward to each year, as that's typically when you get the best bang for your buck. The optimal time to buy a new graphics card usually comes once both AMD and NVIDIA have launched their new lineups. AMD tends to launch first, followed by NVIDIA, which kicks off a cycle of proactive and reactive price cuts between the two. That cycle churns up the $300 price-performance sweet spot so thoroughly that a purchase from that segment usually sets your PC up for the following three years. 2013-14 saw a major disruption to this cycle: Litecoin mining. Litecoin miners will hurt brand AMD Radeon more than they help it, and here's why.
Litecoin miners are buying up AMD Radeon GPUs. The reason: GPUs based on the Graphics Core Next architecture are exceptionally good at processing peer-to-peer (P2P) currency hash workloads, the only practical way just about anyone with a computer can create real money by putting their CPU or GPU to work. CPUs simply won't cut it with such workloads anymore, multiple cores and AES-NI instruction sets notwithstanding.
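To make "hash workloads" concrete, here's a minimal, purely illustrative sketch of scrypt-based proof-of-work in Python. Litecoin does use scrypt with parameters N=1024, r=1, p=1, and feeds the block header in as both password and salt, but the header bytes, target, and nonce range below are simplifications for illustration, not real Litecoin block validation:

```python
import hashlib

def scrypt_hash(header: bytes, nonce: int) -> bytes:
    # Litecoin's proof-of-work hashes the block header with
    # scrypt(N=1024, r=1, p=1); the data serves as its own salt,
    # mirroring how Litecoin passes the header in twice.
    data = header + nonce.to_bytes(4, "little")
    return hashlib.scrypt(data, salt=data, n=1024, r=1, p=1, dklen=32)

def mine(header: bytes, target: int, max_nonce: int = 100_000):
    # Try nonces until the hash, read as an integer, falls below the
    # target. Every nonce is independent, which is why this search
    # maps so well onto a GPU's thousands of parallel threads.
    for nonce in range(max_nonce):
        if int.from_bytes(scrypt_hash(header, nonce), "little") < target:
            return nonce
    return None
```

Each CPU core runs this loop serially; a GPU runs thousands of such loops at once, which is where the kilohash-per-second figures below come from.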

GPUs inherently handle parallelized workloads better, and it just so happens that investing in AMD Radeon GPUs is profitable for Litecoin miners: deployed at sufficient scale, the cards pay for themselves and go on to generate revenue. Graphics Core Next is the only GPU architecture that keeps GPUs competitive against ASICs, chips purpose-built to handle peer-to-peer currency validation hash processing.
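As a back-of-the-envelope illustration of "pay for themselves," here's a hypothetical payback calculation. The card price and hash rate come from figures cited later in this article; the revenue per MH/s, power draw, and electricity rate are assumptions picked purely for illustration:

```python
# Hypothetical payback-period estimate for a mining card.
card_price_usd      = 550.0   # R9 290X MSRP (cited below)
hash_rate_khs       = 980.0   # R9 290X Litecoin hash rate (cited below)
revenue_per_mhs_day = 18.0    # assumed: USD earned per MH/s per day
power_draw_w        = 300.0   # assumed: card power draw while mining
electricity_usd_kwh = 0.12    # assumed: electricity rate

daily_revenue = (hash_rate_khs / 1000.0) * revenue_per_mhs_day
daily_power_cost = (power_draw_w / 1000.0) * 24 * electricity_usd_kwh
payback_days = card_price_usd / (daily_revenue - daily_power_cost)
print(f"Payback in roughly {payback_days:.0f} days")
# → Payback in roughly 33 days
```

Once the card has paid for itself, everything after the power bill is margin, which is why miners at scale are price-insensitive in a way gamers aren't.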

Litecoin, like Bitcoin, is a peer-to-peer currency. It's decentralized, in that there's no central bank or monetary authority. The U.S. Federal Reserve governs the U.S. Dollar, for example; a triad of HSBC, Standard Chartered, and Bank of China issues the Hong Kong Dollar; the European Central Bank issues the Euro; and most countries have constitutional bodies set up to manage their currencies. Every transaction in such a currency, down to buying soda from a vending machine with loose change, is ultimately vouched for by its central bank. Litecoin has no such authority, so it relies on a distributed computing network to validate each transaction, and the integrity of each unit of the currency, along with the wallet it's part of.

Unlike older distributed computing ventures such as Folding@Home and World Community Grid, which continue to draw people to donate CPU/GPU time to charitable causes such as scientific number-crunching, the Bitcoin and Litecoin networks pay the people who participate in them. They're paid in, well, Bitcoins and Litecoins, respectively. The beauty of it all? Not only can you pay for some goods and services with these currencies, you can also exchange them for your everyday currency. They convert to U.S. Dollars, and a U.S. Dollar converts to just about any other currency on the planet.

The faster you process P2P currency validation loads, the more load is directed toward you, and the more you earn. Performance per dollar immediately becomes king. Litecoin.info compiled an exhaustive list of AMD and NVIDIA GPUs, sorted by their Litecoin hash-processing performance. You'll note how, at reference clock speeds, NVIDIA's GeForce GTX Titan crunches just 320 kH/s (kilohashes per second), while a Radeon R9 290X, at reference clock speeds, yields over three times that, at 980 kH/s. The GeForce GTX 780 Ti in the comparison yields 430 kH/s, but its clock speeds are not mentioned, so you can't take its number at face value. Even if, for a moment, we assume the $650 GTX 780 Ti is running at reference speeds, you can still work out a huge price-performance gap between it and the $550 R9 290X. This, we believe, has led some North American retailers to get opportunistic, inflating the retail price of the R9 290X to close to $700, and that of the R9 290 (non-X) to close to $500, up from the $399 MSRP it launched with.
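Using the article's own hash-rate and price figures, the mining price-performance gap works out as follows (the Titan's $999 launch price is our assumption; it isn't stated above):

```python
# kH/s per dollar, using the hash rates and prices cited above.
cards = {
    "GeForce GTX Titan":  (320, 999),   # kH/s, USD (price assumed)
    "GeForce GTX 780 Ti": (430, 650),
    "Radeon R9 290X":     (980, 550),
}

for name, (khs, price) in cards.items():
    print(f"{name}: {khs / price:.2f} kH/s per dollar")
# → GeForce GTX Titan:  0.32 kH/s per dollar
# → GeForce GTX 780 Ti: 0.66 kH/s per dollar
# → Radeon R9 290X:     1.78 kH/s per dollar
```

Even at an inflated $700, the R9 290X still works out to 1.40 kH/s per dollar, more than double the 780 Ti at MSRP, which is exactly why the hiked prices stick.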

These hikes in AMD Radeon R9 series prices needn't be a reaction to a spike in demand; retailers have the luxury of assuming that anyone buying a Graphics Core Next-based GPU is doing so for Litecoin/Bitcoin. And so we find the argument that Litecoin mining has caused a shortage in AMD Radeon inventories, which in turn is responsible for the price hike, absolute horse poo. More so because AMD's chips not only destroy NVIDIA's, but also hold their own against some purpose-built ASIC boards on the cost-performance metric.

Yet another reason we believe the hike in AMD Radeon prices is not the result of an inventory shortage is pricing in other markets. Retailers in the EU apparently have healthy inventories of AMD Radeon, or at least the unaffected prices of graphics cards there seem to suggest as much, if we were to believe the short-supply argument. It's not that Europeans aren't enamored with P2P currency, or the art of making money; it's that European retailers aren't getting cocky about pricing their AMD Radeon inventory to end users, at least not yet. Eventually, bad AMD Radeon pricing may catch up in Europe, too.

That brings us to the operational portion of this op-ed: how the P2P currency craze hurts AMD more than it helps. AMD isn't manufacturing "Hawaii" silicon on a war footing; there are no indications that the company is scaling up supply to meet the "demand." The inflation in AMD Radeon prices appears to be backed more by the chips' huge price-performance advantage over NVIDIA at P2P currency processing than by short supply. Whoever is into Litecoin processing is apparently willing to cough up $700 for an R9 290X.

What this does is make AMD Radeon prohibitively expensive for its actual target market: gamers and PC enthusiasts. Price-performance gaps between AMD and NVIDIA are tight when it comes to what Radeon and GeForce are actually designed for, rendering 3D graphics for games. Fewer gamers and enthusiasts will buy AMD Radeon from here on out as a result of the Litecoin craze. In the worst case, this could give NVIDIA the opportunity to arbitrarily raise GeForce GTX prices slightly, while still maintaining better price-performance at gaming, if not at P2P currency processing.

Here's what AMD could try to wriggle out of this mess: fight fire with fire, and build low-cost, limited-edition AMD Stream GPGPU boards based on the Graphics Core Next architecture that offer higher price-performance ratios (at P2P currency processing) than even its own Radeon R9 series graphics cards. Those AMD Stream boards could be based on chips purpose-built for P2P currency processing loads, ship without too much memory, and lack unnecessary components, so they could be sold at relatively low prices and beat the Radeon R9 series at price-performance. Again, there are two price-performance ratios at play here, one at P2P currency processing and one at gaming, and the former is currently holding the latter hostage. If AMD succeeds in making the Radeon R9 series unappealing to the P2P currency community, it will restore the brand to its original target audience: gamers.

AMD spent the better part of the past five years building impressive relationships with game developers, who developed and optimized their games for AMD Radeon. The company risks undoing those efforts if it gives in to the Litecoin craze. It may turn some profit by catering to the craze with Radeon R9 in the short term, but those profits will inevitably come at the expense of brand Radeon in the long term. Time for some hard thinking.

131 Comments on Why the Litecoin Craze Hurts More Than Helps Brand AMD Radeon

#51
Steevo
cadaveca
Yes, they do... at $500. Which also buys a GTX 780, which performs equally but has better drivers. I already have a GTX 780, so I'm well aware of the differences on offer, and while both are very similar on paper, the end experience is too far apart.


There are significant resources in AMD GPUs that don't benefit gaming directly, only compute. The GTX 780 proves that easily enough... what it lacks and AMD has... doesn't benefit gaming in any way, really...

AMD has ALWAYS pushed compute, really; they were the first to offer GPU F@H, before NVidia started sponsoring Stanford (which now has a building in a certain somebody's name, making it undeniable). It's just not recognized that AMD led there, because someone else has had a far larger voice in that market. So AMD knew exactly WHAT they were doing with those designs that sit on some shelves today.
Actually, that came down to the 16/24-bit vs. 32-bit precision issues Nvidia had in matching the performance ATI had at the time, back when cutting image quality for more FPS through drivers and hardware was a common thing, more so for Nvidia, though ATI did it too occasionally.
Posted on Reply
#52
theoneandonlymrk
The way GCN works, with 64-wide wavefronts per compute unit, is precisely why they do well at that, plus AMD's initial and fervent support of open standards.
Posted on Reply
#53
Ravenas
The prices will eventually deflate so much due to popularity that it won't be worth it.

On a side note... How do the FirePro GPUs perform for coin mining?
Posted on Reply
#54
theoneandonlymrk
I think all FirePro cards are Tahiti-based atm, and are essentially the same with mere driver optimisation and no significant difference, though it does depend on the application. But in single precision the new R9s have a big lead, so it depends on the precision required in a simulation or algorithm as to which is quickest, AFAIK.

Non-waffle version: they do well.
Posted on Reply
#55
Deadlyraver
I am happy there is some light being shed on Bitcoin mining and its true expense.
Posted on Reply
#56
jigar2speed
theoneandonlymrk
Spot on, and a good question would be.

Why is a guy in Hyderabad, India moaning about inflated prices only seen in America, when, as Bta rightly points out, in Europe the Litecoin markup is nonexistent, as is the shortage?

Bta, did an RMA go bad with AMD or something? Because some of your editorials lately have been ass-shaped.

How about an editorial on NVIDIA or Intel, eh?
I live in India, and you have no idea what you are talking about; the prices of AMD Radeon cards in India are even higher than they are in the USA.
Posted on Reply
#57
Relayer
cadaveca
It's not that they were completely unavailable... I'm still waiting for my 780s to ship because of the holiday madness... it's the price the R9 290 is available at that makes the R9 290 uninteresting, because frankly, NVidia's got their software in far better condition than AMD does, and the lack of a price difference to offset the problems AMD has, for users like me, makes AMD cards unappealing. Simply put, the R9 290 just isn't an option due to the price increases: over $500 for what was supposed to be a $400 card. Why pay $500 and have driver problems (Crossfire and multi-monitor) when you can pay $500 and not have those problems?
I haven't seen any Crossfire and multi-monitor issues reported with 290s. Have you?
Posted on Reply
#58
Relayer
Assimilator
My only confusion with the whole Litecoin/Radeon availability situation is: why has AMD not ramped card production up to "war-time" levels to account for the demand and hence sell more units? AMD has a great opportunity here and, as usual, they don't seem to be taking advantage of it.
You can't book fab time at TSMC on a weekly basis. Contracts are likely done at least a year in advance.
Posted on Reply
#59
Regenweald
Relayer
I haven't seen any Crossfire and multi-monitor issues reported with 290s. Have you?
Actually, in the [H] review of 4K gaming with 290s vs. 780 Tis (as I'm sure most of us have read), the 290 experience was smooth and near plug-and-play, whereas the Ti required significant effort to get going. So I really was not going to even challenge the 2005 'driver' argument.

It's 2014 and some of these guys are still clinging to the old 'better drivers' and 'PhysX' fanboy-wars legacy.
Posted on Reply
#60
rvalencia
cadaveca
It's not that they were completely unavailable... I'm still waiting for my 780s to ship because of the holiday madness... it's the price the R9 290 is available at that makes the R9 290 uninteresting, because frankly, NVidia's got their software in far better condition than AMD does, and the lack of a price difference to offset the problems AMD has, for users like me, makes AMD cards unappealing. Simply put, the R9 290 just isn't an option due to the price increases: over $500 for what was supposed to be a $400 card. Why pay $500 and have driver problems (Crossfire and multi-monitor) when you can pay $500 and not have those problems?
What's the problem with R9-290's multi-monitor?
Posted on Reply
#61
rvalencia
JDG1980
Scrypt (Litecoin mining) relies on integer operations, not floating-point. Specifically, AMD GPUs can do integer bit rotations in one operation, while Nvidia cards require three.
There's more to Scrypt than just the integer bit rotations. From http://en.wikipedia.org/wiki/Scrypt

The scrypt function is designed to hinder such attempts by raising the resource demands of the algorithm. Specifically, the algorithm is designed to use a large amount of memory compared to other password-based KDFs, making the size and the cost of a hardware implementation much more expensive, and therefore limiting the amount of parallelism an attacker can use (for a given amount of financial resources).


From http://wili.cc/blog/gpgpu-faceoff.html

"Ambient Occlusion code is a very mixed load of floating point and integer arithmetics, dynamic branching, texture sampling and memory access. Despite the memory traffic, this is a very compute intensive kernel."



Posted on Reply
#62
rvalencia
james888
Who said they had done anything to make their GPUs mine better? AMD just has the right GPU architecture and has been pushing its compute abilities for all its GPUs. Not that NVIDIA is a slouch in compute ability, but didn't they cut some of the compute parts off the little Kepler cards?
A mix of integer and floating-point benefits Ambient Occlusion-type workloads. See http://wili.cc/blog/gpgpu-faceoff.html, App 2's bar graph.
Posted on Reply
#63
HisDivineOrder
Regenweald
Actually, in the [H] review of 4K gaming with 290s vs. 780 Tis (as I'm sure most of us have read), the 290 experience was smooth and near plug-and-play, whereas the Ti required significant effort to get going. So I really was not going to even challenge the 2005 'driver' argument.

It's 2014 and some of these guys are still clinging to the old 'better drivers' and 'PhysX' fanboy-wars legacy.
Given that AMD had some serious driver issues as late as 2013, and it WAS 2013 a week ago, I'd say discussing their driver issues, or the potential for driver issues, is not unwarranted or unexpected.

I know it hurts you to remember, but frame latency was a driver problem. Not only that, it wasn't a problem AMD figured out either rapidly or on their own. It was, however, a widely pervasive and long-term problem they had for years before even acknowledging it.

That's to say nothing of the lack of Crossfire support at the 7970 launch, or the impact frame latency had on the usability of Crossfire for all of 2012 and most of 2013. It took AMD from January 2013 until August 2013 to even release a pre-release fix for frame latency, and they still haven't finished a fix for anyone running resolutions higher than 1600p on anything but GCN 1.1-based cards (R9 290/290X, 7790-derived cards), despite spending how many years compelling users to "upgrade" to Eyefinity?

I'd say driver issues are still the topic of the day until AMD starts stringing together YEARS without massive driver issues.
Posted on Reply
#64
cadaveca
My name is Dave
rvalencia
What's the problem with R9-290's multi-monitor?
The issues with dual-GPU, Eyefinity, and the 290 cards are still there, unfortunately, although not as prevalent as they were with the original GCN cards.

I do have a bitter taste in my mouth, since Eyefinity hasn't worked right since day one, IMHO, and I've owned Crossfire and Eyefinity in all its incarnations since launch. It'd be great if it worked perfectly, but it doesn't.
Posted on Reply
#65
EarthDog
Works a lot better now, though, according to most review sites and users. It seems to have placated all but the harshest critics anyway... LOL! Besides, users with multiple monitors and GPUs are less than 1% of the market (IIRC Steam stats had it at like 0.3% or some low #). It should work for everyone, but clearly, with such a small market share, this is back-burner stuff, especially when a 290X/780 Ti can run a lot of games decently at 5760x1080.
Posted on Reply
#66
theoneandonlymrk
jigar2speed
I live in India, you have no idea what you are talking about, the price of AMD radeon card in India are even higher than what they are in USA.
Right, so it's AMD's fault their cards are dear where you're from; the importers have nothing to do with it. The price and availability in the UK are both spot on.
Posted on Reply
#67
cadaveca
My name is Dave
EarthDog
Works a lot better now, though, according to most review sites and users. It seems to have placated all but the harshest critics anyway... LOL! Besides, users with multiple monitors and GPUs are less than 1% of the market (IIRC Steam stats had it at like 0.3% or some low #). It should work for everyone, but clearly, with such a small market share, this is back-burner stuff, especially when a 290X/780 Ti can run a lot of games decently at 5760x1080.
Well, at least AMD are moving in the right direction. I'm not surprised it'd be better passing frames via PCIe than via a silly PCIe x1 bridge...

And it is very much a small market, but at the same time, you can't blame me for wanting to get what AMD advertised but didn't deliver, either. 7970 Crossfire Eyefinity is still very much broken, and is why I bought new cards.

I just wonder what will happen when the mining craze slows down, and all these AMD cards get dumped on the used market, barely working since they've been pushed truly hard.
Posted on Reply
#68
Casecutter
Great discussion.
Comment #1 - Why are the price and stock levels in Europe not affected? As someone pointed out, the cost of VAT and electricity lowers the payback, while exchanging coin to USD and then exchanging that to euros probably kills any huge return. That's not to say folk in Europe aren't getting cards to mine with; they probably bought second-party from the US.

#2 - Why doesn't AMD ramp up production, as folk suggest? It takes weeks/months to schedule new wafer starts, and that assumes TSMC has openings in production. At best, if AMD upped starts at TSMC in mid-December, it might have boxed product on the way into the channel around March/April. Then what happens if AMD adds production only to find that, long before boxed units reach the channel, Litecoin mining has gone bust? AMD would be giving OEMs incentives (kickbacks) to move excess product from production it would surely have paid a premium to get TSMC to schedule. There's not much AMD can do.

One upside may well be the NVIDIA gamer who thought, "I'll buy AMD to mine and then also game!" Face it: if you can buy a high-end gaming card, make money, and even sell it later for almost MSRP, what's not to love? And what if their experience with AMD dispels all the negativity they've thought of it... AMD might win customers!


cadaveca
and is why I bought new cards.
barely working since they've been pushed truly hard.
Nice that you can probably sell those 7970s for a pretty penny, as they're desirable for mining... more than the perhaps $200 each they might have garnered if mining weren't around.

I didn't really think most hardware "degrades" from being "pushed hard", as long as the environment/cooling has been sufficient.
Posted on Reply
#69
Regenweald
HisDivineOrder
Given that AMD had some serious driver issues as late as 2013, and it WAS 2013 a week ago, I'd say discussing their driver issues, or the potential for driver issues, is not unwarranted or unexpected.

I know it hurts you to remember, but frame latency was a driver problem. Not only that, it wasn't a problem AMD figured out either rapidly or on their own. It was, however, a widely pervasive and long-term problem they had for years before even acknowledging it.

That's to say nothing of the lack of Crossfire support at the 7970 launch, or the impact frame latency had on the usability of Crossfire for all of 2012 and most of 2013. It took AMD from January 2013 until August 2013 to even release a pre-release fix for frame latency, and they still haven't finished a fix for anyone running resolutions higher than 1600p on anything but GCN 1.1-based cards (R9 290/290X, 7790-derived cards), despite spending how many years compelling users to "upgrade" to Eyefinity?

I'd say driver issues are still the topic of the day until AMD starts stringing together YEARS without massive driver issues.
So, frame latency then. Sub-1% of the customer usage base. I was actually going to mention it in my post, something along the lines of "save for the frame-pacing issue," but then I thought, 'Is anyone really going to grasp at a niche multi-card issue and umbrella all issues under that last bastion of the fanboy wars?'

Well......

Gameplay was choppy, people definitely complained, but many played on (it's not like the bad pacing caused black screens, is it?), and some red fanboys even claimed not to notice. It's come to light and is being fixed. Anything else? I mean, it's not like the frame latency was caused by faulty cards and led to a class-action settlement, is it?
Posted on Reply
#70
cadaveca
My name is Dave
Regenweald
So, frame latency then. Sub-1% of the customer usage base. I was actually going to mention it in my post, something along the lines of "save for the frame-pacing issue," but then I thought, 'Is anyone really going to grasp at a niche multi-card issue and umbrella all issues under that last bastion of the fanboy wars?'

Well......

Gameplay was choppy, people definitely complained, but many played on (it's not like the bad pacing caused black screens, is it?), and some red fanboys even claimed not to notice. It's come to light and is being fixed. Anything else? I mean, it's not like the frame latency was caused by faulty cards and led to a class-action settlement, is it?
Actually, yeah, since I am part of that niche, I have those issues. It IS a valid point of contention that influences my current purchasing, after all. My 7970 Crossfire isn't perfect... so there's no way I'd expect R9 290 Crossfire NOT to have the same problems, since AMD was so adamant that the problems weren't hardware-related...

Many games have had frame latency issues, or other problems, since day one, and switching between single-GPU and Crossfire doesn't work right without rebooting; those issues present a large problem for me. I have 429 games on Steam alone... and maybe 10% work right in Eyefinity and Crossfire.

So, my PSU blew up. I need to buy a new one. Might as well buy new GPUs at the same time... but only 10% of my games work right with my 7970s... might as well give NVidia a shot and deal with its other issues.
Posted on Reply
#71
EarthDog
It's possible, being on different architectures, that the results would be different in CFX... but perhaps I have a glass-half-full approach, seeing as this never affected me as badly. Then again, I have like 15 games in Steam, of which I really play one (I use the others for GPU reviews, LOL!).


cadaveca
Well, at least AMD are moving in the right direction. I'm not surprised it'd be better passing frames via PCIe than via a silly PCIe x1 bridge...

And it is very much a small market, but at the same time, you can't blame me for wanting to get what AMD advertised but didn't deliver, either. 7970 Crossfire Eyefinity is still very much broken, and is why I bought new cards.

I just wonder what will happen when the mining craze slows down, and all these AMD cards get dumped on the used market, barely working since they've been pushed truly hard.
Oh no, most certainly I can't blame you for wanting them to get it right!

Your last part.. Damn good question there.. some unhappy people when they crap out sooner rather than later. I know that will be my question when I see them come up FS...
Posted on Reply
#72
rvalencia
cadaveca
The issues with dual-GPU, Eyefinity, and the 290 cards are still there, unfortunately, although not as prevalent as they were with the original GCN cards.

I do have a bitter taste in my mouth, since Eyefinity hasn't worked right since day one, IMHO, and I've owned Crossfire and Eyefinity in all its incarnations since launch. It'd be great if it worked perfectly, but it doesn't.
Besides the lack of 5760x1080 settings in some games, I haven't experienced Eyefinity issues with my old solo 7950 OC, or with my later 7970 OC + 7950 OC config (not listed in my system specs).

I use RadeonPro to mitigate the microstutter issues:
http://www.radeonpro.info/2013/03/crossfire-microstutter-afterburner-vs-radeon-pro/
Posted on Reply
#73
rvalencia
cadaveca
Actually, yeah, since I am part of that niche, I have those issues. It IS a valid point of contention that influences my current purchasing, after all. My 7970 Crossfire isn't perfect... so there's no way I'd expect R9 290 Crossfire NOT to have the same problems, since AMD was so adamant that the problems weren't hardware-related...

Many games have had frame latency issues, or other problems, since day one, and switching between single-GPU and Crossfire doesn't work right without rebooting; those issues present a large problem for me. I have 429 games on Steam alone... and maybe 10% work right in Eyefinity and Crossfire.

So, my PSU blew up. I need to buy a new one. Might as well buy new GPUs at the same time... but only 10% of my games work right with my 7970s... might as well give NVidia a shot and deal with its other issues.
From
http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-6.html

The GeForce GTX 690 has its own microstutter issues.
Posted on Reply
#75
rvalencia
cadaveca
You can't use results from nearly a year ago. Drivers on both sides have matured greatly since then.
My point was that AMD doesn't have a monopoly on microstutter issues.
Posted on Reply