Wednesday, January 1st 2014

Why the Litecoin Craze Hurts More Than Helps Brand AMD Radeon

Price wars between GPU makers are something to look forward to each year, as that's typically when you get the best bang for your buck. The optimal time to buy a new graphics card usually comes after both AMD and NVIDIA have launched their new lineups. AMD tends to launch first, followed by NVIDIA, which kicks off a cycle of proactive and reactive price-cuts between the two. That cycle churns up the $300 price-performance sweet-spot so thoroughly that a purchase from that segment usually sets your PC up for the following three years. 2013-14 saw a major disruption to this cycle: Litecoin mining. Litecoin miners will hurt brand AMD Radeon more than help it, and here's why.
Litecoin miners are buying up AMD Radeon GPUs. The reason: GPUs based on the Graphics CoreNext architecture are exceptionally good at processing P2P currency hash workloads, which are the only way just about anyone with a computer can create real money by putting their CPU or GPU to work. CPUs simply won't cut it with such processing loads anymore, despite being multi-core and equipped with AES-NI instruction sets.

GPUs inherently handle parallelized workloads better, and it just so happens that investing in AMD Radeon GPUs is profitable for Litecoin miners: deployed at a large enough scale, the cards pay for themselves and go on to generate revenue. Graphics CoreNext is the only GPU architecture that keeps GPUs competitive against ASICs, chips purpose-built to handle peer-to-peer currency validation hash processing.
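For context on what these hash workloads actually are: Litecoin's proof-of-work is scrypt (N=1024, r=1, p=1), a deliberately memory-hard function, which is part of why GPUs with fast memory subsystems fare well at it. A minimal sketch in Python, using a placeholder 80-byte block header and an arbitrary, illustrative difficulty target (real miners iterate a nonce inside the header millions of times per second):

```python
import hashlib

# Litecoin's proof-of-work hashes the 80-byte block header with
# scrypt, using the header as both password and salt. The header
# below is a placeholder; a real one encodes version, previous
# block hash, merkle root, timestamp, difficulty bits, and nonce.
header = b"\x00" * 80  # placeholder block header

digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

# Interpret the 32-byte digest as a little-endian integer and compare
# against the difficulty target; a smaller value is a valid share.
value = int.from_bytes(digest, "little")
target = 2 ** 240  # purely illustrative target
print(value < target)
```

Mining is simply repeating this computation with a different nonce each time until the comparison succeeds, which is why raw hash throughput (kH/s) translates directly into earnings.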

Litecoin, like Bitcoin, is a peer-to-peer currency. It's decentralized, in that there's no central bank or monetary authority behind it. The U.S. Federal Reserve governs the U.S. Dollar, for example; a triad of HSBC, Standard Chartered, and Bank of China issues the Hong Kong Dollar; the European Central Bank issues the Euro; and most countries have constitutional bodies set up to manage their currencies. Every transaction using a currency, down to buying soda from a vending machine with loose change, is ultimately vouched for by its central bank. Litecoin has no such authority, so it relies on a distributed computing network to validate each transaction, and the integrity of each unit of the currency, along with the wallet it's part of.

Unlike older distributed computing ventures such as Folding@Home and WCG, which continue to draw people to donate CPU/GPU time to charitable causes like processing scientific problems, the Bitcoin and Litecoin networks pay people who participate in their computing networks. They're paid in, well, Bitcoins and Litecoins, respectively. The beauty of it all? Not only can you pay for some goods and services with these currencies, you can also exchange them for your everyday currency. They convert to U.S. Dollars, and you can convert a U.S. Dollar into just about any other currency on the planet.

The faster you process P2P currency validation loads, the more work is directed your way, and the more you earn. Performance per dollar immediately becomes king. Litecoin.info compiled an exhaustive list of AMD and NVIDIA GPUs, sorted by Litecoin hash processing performance. You'll note that at reference clock speeds, NVIDIA's GeForce GTX Titan crunches just 320 kH/s (kilo-hashes per second), while a Radeon R9 290X, at reference base clock speeds, yields over three times that, at 980 kH/s. The GeForce GTX 780 Ti in the comparison yields 430 kH/s, but its clock speeds are not mentioned, so you can't take its numbers at face value. Even if we assume for a moment that the $650 GTX 780 Ti is running at reference speeds, there's still a huge price-performance gap between it and the $550 R9 290X. This, we believe, has led some North American retailers to get opportunistic, inflating the retail price of the R9 290X to close to $700, and that of the R9 290 (non-X) to close to $500, from the $399 MSRP it launched at.
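Putting those figures side by side makes the gap stark. A quick kH/s-per-dollar sketch: the $650 (GTX 780 Ti) and $550 (R9 290X) prices are quoted above, while the GTX Titan's $999 figure is its launch MSRP, added here for context rather than taken from the comparison:

```python
# Price-performance at Litecoin mining, using the hash rates quoted
# above. Prices: article figures for 780 Ti and 290X; the Titan's
# $999 is its launch MSRP (not stated in the article).
cards = {
    "GTX Titan":  (999, 320),
    "GTX 780 Ti": (650, 430),
    "R9 290X":    (550, 980),
}

for name, (price, khs) in sorted(cards.items(),
                                 key=lambda kv: kv[1][1] / kv[1][0],
                                 reverse=True):
    print(f"{name:11} {khs / price:.2f} kH/s per dollar")
    # R9 290X ~1.78, GTX 780 Ti ~0.66, GTX Titan ~0.32
```

At nearly triple the kH/s per dollar of its closest NVIDIA rival, the R9 290X is the obvious buy for a miner even at an inflated street price.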

These hikes in prices of AMD Radeon R9 series products needn't be a reaction to a spike in demand; retailers have the luxury of assuming that anyone buying a Graphics CoreNext-based GPU is doing so for Litecoin/Bitcoin. And so we find the argument that Litecoin mining has caused a shortage in AMD Radeon inventories, which is in turn responsible for the price hike, absolute horse poo. More so because AMD's chips not only destroy NVIDIA's on the cost-performance metric, they also hold their own against some purpose-built ASIC boards.

Yet another reason we believe the hike in AMD Radeon prices is not the result of an inventory shortage is pricing in other markets. Retailers in the EU apparently have healthy inventories of AMD Radeon, or at least the unaffected prices of graphics cards there seem to suggest that, if we were to believe the short-supply argument. It's not that Europeans aren't enamored by P2P currency, or the art of making money; European retailers simply aren't getting cocky about pricing their AMD Radeon inventory to end users, at least not yet. Eventually, bad AMD Radeon pricing may catch up in Europe too.

That brings us to the operative portion of this op-ed: how the P2P currency craze hurts AMD more than it helps. AMD isn't manufacturing "Hawaii" silicon on a war footing; there are no indications that the company is scaling up supply to meet the "demand." The inflation in AMD Radeon prices appears to be backed more by the chips' huge price-performance advantage over NVIDIA at P2P currency processing than by short supply. Whoever is into Litecoin processing is apparently willing to cough up $700 for an R9 290X.

What this does is make AMD Radeon prohibitively expensive for its actual target market: gamers and PC enthusiasts. Price-performance gaps between AMD and NVIDIA are tight when it comes to what Radeon and GeForce are actually designed for, rendering 3D graphics for games. Fewer gamers and enthusiasts will buy AMD Radeon from here on out as a result of the Litecoin craze. In the worst-case scenario, this could give NVIDIA the opportunity to arbitrarily raise prices of GeForce GTX products slightly, while still maintaining better price-performance at gaming, if not at P2P currency processing.

Here's what AMD could try to wriggle out of this mess: fight fire with fire, and build low-cost, limited-edition AMD Stream GPGPU boards based on the Graphics CoreNext architecture that offer better price-performance at P2P currency processing than even its own Radeon R9 series graphics cards. Those AMD Stream boards could be based on chips purpose-built for P2P currency processing loads, carry only as much memory as they need, and lack unnecessary components, so they could be sold at relatively low prices and beat the Radeon R9 series at price-performance. Again, there are two price-performance ratios at play here, one at P2P currency processing and one at gaming, and in the current scenario the former is holding the latter hostage. If AMD succeeds in making the Radeon R9 series unappealing to the P2P currency community, it will restore the brand to its original target audience, the gamers.

AMD spent the better part of the past five years building some impressive game-developer relations, with studios developing and optimizing their games for AMD Radeon. The company risks harming those efforts if it gives in to the Litecoin craze. It may turn some profit by catering to the craze with Radeon R9 in the short term, but those profits will inevitably come at the expense of brand Radeon in the long term. Time for some hard thinking.

131 Comments on Why the Litecoin Craze Hurts More Than Helps Brand AMD Radeon

#76
ValenOne
cadavecaWell, at least AMD are moving in the right direction. I'm not surprised it'd be better passing frames via PCIe than via a silly PCiex1 bridge...

And it is very much a small market, but at the same time, you can't blame me for wanting to get what AMD advertised but didn't deliver, either. 7970 Crossfire Eyefinity is still very much broken, and is why I bought new cards.

I just wonder what will happen when the mining craze slows down, and all these AMD cards get dumped on the used market, barely working since they've been pushed truly hard.
Your post is hypocritical when NVIDIA has non-gaming heavy workloads for their CUDA-based GPUs. Digital coin mining is not the first workload that wants 24/7 operation, e.g. Folding@Home.

With NVIDIA dominating GpGPU over the past years, the main article is hypocritical, i.e. one rule for NVIDIA and a different rule for AMD.
Posted on Reply
#77
cadaveca
My name is Dave
rvalenciaMy point was AMD doesn't have a monopoly on microstutter issues.
Nah, of course not. But after having had both SLi and Crossfire and doing many builds over many different platforms with each...posting those results here...what NVidia provides with multi-GPU is better for me and my dollar.

If I could actually get the R9 290 @ the MSRP, then that's what I would have bought.
rvalenciaYour post is hypocritical when NVIDIA has non-gaming workloads for their CUDA-based GPUs. Digital coin mining is not the first workload that wants 24/7 operation, e.g. Folding@Home.

With NVIDIA dominating GpGPU, the main article is hypocritical i.e. one rule for NVIDIA and a different rule for AMD.
NVidia dominating is purely based on marketing. AMD(ATI) has ALWAYS delivered better HARDWARE solutions for GPGPU, IMHO. They just suck in the software department.


From my perspective, really, AMD has always been the trend-setter in this market, and NVidia comes in after and capitalizes on what AMD provides.
Posted on Reply
#78
ValenOne
theoneandonlymrkThe way Gcn works well with 64bit wave fronts per compute unit is precisely why they do well that and Amd's initial and fervent support of open standards.
No, the wavefront is larger than a 64-bit payload, i.e. there are four SIMD payloads within one wavefront. The SIMD's FMA operand format is 3.
Posted on Reply
#79
ValenOne
cadavecaNah, of course not. But after having had both SLi and Crossfire and doing many builds over many different platforms with each...posting those results here...what NVidia provides with multi-GPU is better for me and my dollar.

If I could actually get the R9 290 @ the MSRP, then that's what I would have bought.



NVidia dominating is purely based on marketing. AMD(ATI) has ALWAYS delivered better HARDWARE solutions for GPGPU, IMHO. They just suck in the software department.


From my perspective, really, AMD has always been the trend-setter in this market, and NVidia comes in after and capitalizes on what AMD provides.
Is it purely based on marketing when NVIDIA has more GpGPU installations in HPC compared to AMD?

From www.dailyfinance.com/2014/01/03/can-amd-win-the-high-end/
"DigiTimes estimates the move could increase AMD's share of the professional GPU market to 30%, taking away share from NVIDIA"


----------

I disagree with "AMD(ATI) has ALWAYS delivered better HARDWARE solutions for GPGPU", e.g. the VLIW5-based Radeon HD is not suited for complex GpGPU workloads, and there are reasons why AMD dumped the VLIW5 and VLIW4 Radeon HD designs.

If you read http://wili.cc/blog/gpgpu-faceoff.html, each GPU design offers different advantages.

Notice how "App 2" (Ambient Occlusion)'s integer-arithmetic advantage relates to AMD GCN's coin-mining results.
Posted on Reply
#80
HammerON
The Watchful Moderator
rvalencia - please stop double and triple posting. Use the "Edit" and "Reply" features when wanting to add to a post you have already made.
Posted on Reply
#81
Cool Vibrations
EarthDogBut AMD has done nothing DIRECTLY to be able to mine more efficiently than NVIDIA... It is the mining software/structure and the architecture of the hardware that makes it more efficient. I would be willing to bet money that AMD had nothing to do with how well their cards mine; that is, they did not make their architecture more efficient specifically for mining.

I also disagree that AMD doesn't generate 'noise'. To be fair, all their marketing is hype. Look at the Rx XXX releases: the vast majority of that press conference had to do with TrueAudio. So, major hype with no content (still marketing regardless of the outcome). Perhaps a positive outcome of their marketing is what you meant?
Wrong.
DiabloMiner author here. Nvidia keeps claiming they produce a product for GPGPU compute, yet they keep failing on integer performance. Bitcoin is not the only use of integers out there, and it's not even limited to crypto research either.

There is zero reason for Nvidia to have made this fundamental mistake generation after generation. This is why I don't support their product; it's too slow to be useful, and Nvidia doesn't seem to care. I repeatedly tried to reach out to that company, and I never got a response.
Source: www.extremetech.com/computing/153467-amd-destroys-nvidia-bitcoin-mining#comment-865242130

Where are your sources? It's people like you that have made me stop supporting Nvidia. They have too many loyalists that are completely clueless. Keep buying their overpriced products, though.
Posted on Reply
#82
Bow
Cool VibrationsWhere are your sources? It's people like you that have made me stop supporting Nvidia. They have too many loyalists that are completely clueless. Keep buying their overpriced products, though.
And that's all I have to say about that.:toast:
Posted on Reply
#83
Vario
The sad thing is bitcoining ruins the cards, so don't expect any usable cards on the second-hand market. My second-hand 7970 came from a miner and isn't stable even at stock speeds (it crashes at idle and sometimes under load), and I am warrantying it as a result :(. Really glad they honor second-hand cards. The PO didn't tell me it was used for mining... I found that out later.
Posted on Reply
#84
Nordic
Every bitcoin miner plans to sell his GPUs when he is done with them, and generally the idea is to keep them sellable. I know a lot of miners try to keep their GPUs below 80°C; I myself aimed for no more than 75°C. It does depend on the miner, though.
Posted on Reply
#85
The Von Matrices
VarioThe sad thing is bitcoining ruins the cards. So don't expect any useable cards on the second hand market. My second hand 7970 came from a miner and it isn't stable at stock speeds even (crashes idle and sometimes load) and I am warrantying it as a result :(. Really glad they honor second hand cards. The PO didn't tell me it was used for mining... I found that out later.
How did you determine the damage was caused by mining rather than any other number of ways, even something as simple as being dropped in transit? Blaming damage on GPU mining seems like a convenient excuse to avoid analyzing a complicated problem with multiple potential causes. In addition, there are always fraudulent sellers passing off broken products as working; GPU mining doesn't change that.
Posted on Reply
#86
Peter1986C
Some might OC the VRAM a touch too hard, though, since scrypt-based hashing seems to benefit from VRAM overclocking. At least my card seems to benefit from it while running Feathercoin, a temporary experiment of mine that I will soon quit because of the lack of profitability.
Posted on Reply
#87
Vario
The Von MatricesHow did you determine the damage was caused by mining rather than any other number of ways, even something as simple as being dropped in transit? Blaming damage on GPU mining seems like a convenient excuse to avoid analyzing a complicated problem with multiple potential causes. In addition, there are always fraudulent sellers passing off broken products as working; GPU mining doesn't change that.
Because it's the only thing that makes sense. The card is physically damage-free and looks brand new, no dust at all. It's really bendy, though, like the guy threw three of them in a case without supporting them and they heated up and warped. He was selling 5 at the time as working. My 7850 is totally 100% stable; the 7970 destabilizes the system in seconds. I don't want to go into the long of it, but I tested every component and did multiple reformats, and it's the GPU. It's been sent to Gigabyte and passed the stress test because their engineer didn't read my note about it crashing at idle if you let it sit. Now I have to send it back.

I am running the stock BIOS with no overclock and the latest drivers. It's the bitcoining. I saw the eBay seller had extensive litecoin feedback after I bought it. Blah. I tried BIOS flashing, even a 7970 GHz BIOS, to see if it would raise the voltages for stability, but no luck. The rev 2.1 locks it down, so even adding volts to shore up the stock clock is impossible. I get artifacts even at stock, and temperatures don't even crack 65°C.

Also, I think it's either the RAM or the VRM, because the memory won't tolerate even a 10 MHz overclock in Catalyst (zero headroom isn't normal), or it might be something to do with the VRM not maintaining idle voltages.
Posted on Reply
#88
Peter1986C
I am not quite an expert in this, but it sounds to me like the VRM(s) on the card aren't powering it right. What voltage readings do Afterburner or GPU-Z report?
Posted on Reply
#89
de.das.dude
Pro Indian Modder
I want to get into litecoin but I can't find the knowledge lol
Posted on Reply
#90
Vario
Chevalr1cI am not quite an expert in this, but it does sound to me that it is the VRM(s) on the card not powering it right. What voltage readings do Afterburner or GPU-Z report?
I gotta reinstall it to find out, might be awhile lol
Posted on Reply
#91
bpgt64
I really wanted to get a pair of R9 290s with the Sapphire or PowerColor coolers, but since the prices jumped I wound up getting an EVGA Titan for $600 from eBay instead of selling my Titan.
Posted on Reply
#92
Regenweald
bpgt64I really wanted to get a pair of r9 290s with the sapphire tech or power color coolers but since the prices jumped I wound up getting a evga titan for 600 from eBay instead of selling my titan..
And now you get to run all games on stratosphere settings. That's a litecoin win bro.
Posted on Reply
#93
Relayer
HisDivineOrderGiven that AMD had some serious driver issues as late as 2013 and it WAS 2013 a week ago, I'd say discussing their driver issues or potential for driver issues is not unwarranted or unexpected.

I know it hurts you to remember, but frame latency was a driver problem. Not only that, but it wasn't a problem AMD figured out either rapidly or on their own. Certainly, however, it was a widely pervasive and longterm problem they had for years before even acknowledging it.

That's to say nothing of the lack of Crossfire support at the 7970 launch, or the impact frame latency had on the usability of Crossfire for all of 2012 and most of 2013. It took AMD from Jan 2013 until August 2013 to even release a pre-release fix for frame latency, and they still haven't finished a fix for anyone running resolutions higher than 1600p on cards that aren't GCN 1.1-based (R9 290/290X, 7790-derived cards), despite spending how many years compelling users to "upgrade" to Eyefinity?

I'd say driver issues are still topic of the day until AMD starts having more frequent YEARS of not having massive driver issues.
So, I'll ask you: do you know of any issues with Hawaii and Crossfire/Eyefinity? That's specifically what was being claimed. Most of us are aware of Tahiti's issues, but I've seen nothing that shows Hawaii has similar ones. Simply painting Hawaii with Tahiti's brush is disingenuous.

As far as it being a driver issue, since Hawaii and Tahiti use the same drivers, I'd say it's likely a hardware problem that's had a Band-Aid applied via drivers (supposed to be out this month).
Posted on Reply
#94
Relayer
cadavecaWell, at least AMD are moving in the right direction. I'm not surprised it'd be better passing frames via PCIe than via a silly PCiex1 bridge...

And it is very much a small market, but at the same time, you can't blame me for wanting to get what AMD advertised but didn't deliver, either. 7970 Crossfire Eyefinity is still very much broken, and is why I bought new cards.

I just wonder what will happen when the mining craze slows down, and all these AMD cards get dumped on the used market, barely working since they've been pushed truly hard.
Where are you getting this from? Documentation of any type, please.
Posted on Reply
#95
Octopuss
Barely working? That's the first time I've seen a card "barely work". It either works or it's broken and doesn't; there's nothing in between.
It's also the first time I've seen the claim that a card degrades over time from intense usage.
Posted on Reply
#96
Vario
OctopussBarely working? First time I see a card "barely works". It either does or is broken and does not, there's nothing inbetween.
Also first time I see a claim a card degrades over time by intense usage.
There's plenty of times things barely work. Here's an example: the card is overclocked and tweaked to run bitcoin at maximum hash rate. Then the user flashes the BIOS back and sells it on eBay. It is now unstable even at stock settings, and I have to underclock it to get it to work.

When I was younger I owned a lot of things that degraded really badly from heat. I had a Pentium 4 inside a 2004 Shuttle barebones mini PC that ran really hot (100°C) all the time, because the heatpipes in the Shuttle case would shift the thermal paste when the case was moved. That Pentium 4 never worked right afterward on later full-sized mobos. Heat ruins electronics. Remember the Xbox RROD? Mining generates a lot of heat when you put 4 cards next to each other.



Here is a typical mining setup:


Now I know why my PCB sags so bad.

Here's an in-house example:
GC_PaNzerFINVery rare HW pr0n now.

Dead Tahiti XTL naked - core shattered like glass, probably during heat expansion during heavy load (mining litecoin). o_O
www.techpowerup.com/forums/threads/sexy-hardware-close-up-pic-clubhouse.71955/page-306#post-3043802
Posted on Reply
#97
Octopuss
LOL, what the actual fuck?!

I can see what you mean, at least partially.
Guess I assumed people had brains, which apparently applies in a homeopathic number of cases only :D
Posted on Reply
#98
Vario
OctopussLOL what the actual fuck?!

I can see what you mean, partially at least though.
Guess I assumed people with brains, which apparently applies in homeopatic number of cases only :D
Yeah, there's a lot of people who don't have brains in it; that's why it's such a bubble. I don't know how anyone could ever statistically model a futures price for bitcoin, because it's purely based on past price increases rather than anything else. That's a bubble: "yeah man, it's gonna go up because it's been going up, time to buy in!" It has a really high cost of carry (the electricity bill), and it has inherent value only in that it allows pedos and druggies to buy illegal stuff.

Anyway, it ruins cards. Try running FurMark 24/7 for a year and see if your card still works. Same deal.
Posted on Reply
#99
captianpicard
As a gamer/hobbyist miner, IMO there is no reason to buy nvidia at this point. Why buy an nvidia card when I can run my Radeons when I'm not gaming and essentially have my cards pay for themselves after only 1 or 2 months? I enjoyed nvidia products for many years, but I am not going to turn down what is essentially a top-tier, free CrossFire setup. The litecoin difficulty rose as more and more AMD cards were added to the network, but it seems to have stabilized as the supply of cards dried up. Litecoin mining is still profitable, assuming you pay a reasonably low rate for electricity. Besides, the money nowadays is not really in bitcoin or even litecoin anymore, but in other scrypt-based alt coins such as dogecoin/gridcoin, where the profits are potentially enormous for early miners.

I have been an avid nvidia follower for most of my gaming career, but things have shifted with the litecoin craze. An HD 7950 that sold for <$200 last year now sells for 1.5x that, used. So a gamer who bought a 7950 or 7970 in the past year actually saw the value of their card RISE over time, something I could never have imagined. In terms of performance, all of my AMD cards overclock extremely well, far better than anything I've been able to do with my nvidia cards. They also perform well in all the games I've tried, and the drivers are not nearly as bad as people make them out to be.
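The "pay for themselves in 1 or 2 months" claim boils down to simple arithmetic: daily coin revenue minus daily electricity cost, divided into the card's purchase price. A sketch with entirely hypothetical numbers; none of these figures (revenue per day, power draw, electricity rate) come from the thread, they only illustrate the calculation:

```python
# Payback period for a mining card. All inputs are assumed,
# illustrative values, not measurements from the discussion above.
def payback_days(card_cost_usd, revenue_per_day_usd, watts, usd_per_kwh):
    # Electricity cost for running the card 24 hours a day.
    electricity_per_day = watts / 1000 * 24 * usd_per_kwh
    net_per_day = revenue_per_day_usd - electricity_per_day
    if net_per_day <= 0:
        return None  # never pays for itself at these rates
    return card_cost_usd / net_per_day

days = payback_days(card_cost_usd=550, revenue_per_day_usd=12.0,
                    watts=290, usd_per_kwh=0.12)
print(f"{days:.0f} days")  # prints "49 days" under these assumed inputs
```

Note how sensitive the result is to the electricity rate and, above all, to network difficulty: the assumed revenue-per-day figure decays as more hashing power joins the network, which is exactly the stabilization the post describes.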
Posted on Reply
#100
GC_PaNzerFIN
I have never seen a typical load that comes as close to FurMark as mining cryptocurrency, and people tend to do it 24/7. Heat output and temperatures are high compared to any typical gaming load. VRMs also easily go above 100 degrees Celsius, which is something people often forget. Fans need to run at high RPM to keep cards within acceptable operating temperatures. Indeed, temperature has a pretty serious effect on the failure rate of CMOS devices.

Here is an example with a certain memory IC. Quite a big impact going from 90°C to 60°C, isn't it?


Integrated circuits also degrade slowly over time, and excess heat and voltage only make it worse. There are tons of articles on the web for those interested.

This one, for example; you could try Wikipedia too:
www.semicon.panasonic.co.jp/en/aboutus/pdf/t04007be-3.pdf

PS. Every proper integrated-circuit designer / electrical engineer knows this. If you are not one of them, maybe you should check the facts before making too much noise?
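The temperature-to-failure-rate relationship described in this post is commonly modeled with the Arrhenius acceleration factor. A sketch, assuming a typical activation energy of 0.7 eV (this value varies by failure mechanism and is an assumption here, not a figure from the thread):

```python
import math

# Arrhenius acceleration factor:
#   AF = exp((Ea / k) * (1/T_low - 1/T_high))
# Ea = activation energy in eV (assumed 0.7 eV, mechanism-dependent),
# k  = Boltzmann's constant in eV/K, temperatures in kelvin.
def acceleration_factor(t_low_c, t_high_c, ea_ev=0.7):
    k = 8.617e-5  # Boltzmann constant, eV/K
    t_low = t_low_c + 273.15
    t_high = t_high_c + 273.15
    return math.exp((ea_ev / k) * (1 / t_low - 1 / t_high))

# Running a memory IC at 90°C instead of 60°C, per the example above:
af = acceleration_factor(60, 90)
print(f"wear-out failures arrive ~{af:.1f}x faster at 90°C than at 60°C")
```

Under this (assumed) activation energy, a 30-degree rise accelerates wear-out by roughly a factor of seven, which is consistent with the "quite a big impact" the memory-IC example shows.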
Posted on Reply