
RX Vega Achieves 43 MH/s @ 130 W in Ethereum Mining

Raevenlord

News Editor
AMD's RX Vega is more of a compute card that was brought over to the consumer segment for gaming workloads than the other way around. Raja Koduri himself has said something along those lines (extrapolating a little beyond what he can actually say), and that much can be gleaned with at least a modicum of confidence from AMD's market positioning and overall compute push. In the argument between gamers and miners, Raja Koduri didn't have all that much to say, but for AMD, a sale is a sale, and it would seem that after some tweaking, RX Vega graphics cards can achieve much higher mining efficiency than their Polaris counterparts, further showing how Vega handles compute workloads much better - and more efficiently - than traditional gaming ones.





Now granted, Vega's strength in mining tasks - Ethereum in particular - stems mainly from the card's use of HBM2 memory, as well as a wide architecture with its 4096 stream processors. By setting the core clock to 1000 MHz, the HBM2 memory clock to 1100 MHz, and the power target to -24%, Reddit user S1L3N7_D3A7H was able to leverage Vega's strengths in Ethereum's PoW (Proof of Work) algorithm, achieving 43 MH/s at just 130 W of power (104 W of which go to the core alone). For comparison, tweaked RX 580 graphics cards usually deliver around 30 MH/s at 75 W core power, which amounts to around 115 W of power draw per card. So Vega achieves 43% more hash rate for a meager 13% increase in power consumption - a worthy trade-off if miners have ever seen one. It also means Vega 64 beats RX 580 cards in per-node hashrate density: miners can pack more hashing power into a single system, for a denser configuration with much higher performance than a similarly specced RX 580-based mining station. This was achieved even without AMD's special-purpose beta mining driver, which has seen reports of graphical corruption and instability - the scenario could improve for miners even further with a stable release.
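For a quick sanity check of the trade-off described above, here is a minimal sketch of the perf-per-watt arithmetic, using only the figures quoted in this article (which are, to be clear, the redditor's own readings):

```python
# Back-of-envelope efficiency check using only the figures quoted above:
# ~43 MH/s at ~130 W for the tuned Vega 64, ~30 MH/s at ~115 W for a tweaked RX 580.

def efficiency(hashrate_mhs: float, power_w: float) -> float:
    """Mining efficiency in MH/s per watt."""
    return hashrate_mhs / power_w

vega_64 = efficiency(43.0, 130.0)   # ~0.33 MH/s per W
rx_580 = efficiency(30.0, 115.0)    # ~0.26 MH/s per W

print(f"Vega 64: {vega_64:.2f} MH/s per W")
print(f"RX 580:  {rx_580:.2f} MH/s per W")
print(f"Hashrate gain:  {43.0 / 30.0 - 1:.0%}")    # ~43% more hash rate
print(f"Power increase: {130.0 / 115.0 - 1:.0%}")  # ~13% more power draw
```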



Moreover, S1L3N7_D3A7H said he could probably achieve the same mining efficiency on a Vega 56, which isn't all that unbelievable - memory throughput is king in Ethereum mining, so HBM2 could still be leveraged in that graphics card. It seems that at least some of that initial Vega 64 stock went into miners' hands, as expected. And with this news, I think we'd be forgiven for holding on to our hats when it comes to expectations of increased Vega stock (at the original $499 MSRP for Vega 64 and $399 for Vega 56) come October. Should the user's claims about RX Vega 56 efficiency be verified, and ceteris paribus in the mining algorithm landscape for the foreseeable future, we may well be waiting for respectable inventory until Navi enters the scene.

View at TechPowerUp Main Site
 
No real issue with this from a consumer perspective. It would be great if AMD would market their products better so that people can be better informed.

tl;dr
AMD for mining and compute
NVIDIA for gaming
 
Well, at least they'll be able to clear all the inventory to miners then... And the "crappy" RX 580 will become available again for gamers wanting a new, modern graphics card for 2018...
 
Am I the only one who questions the claimed 130 W draw? Seems a bit off. I would like to see testing with proper meters to see what the card is really using.

Software isn't always right about power draw.
 
Investment in better Drivers *WASTED*
Investment in Mantle and Vulkan and DX12 *WASTED*
Investment in better Collaboration with Game- and Engine-Makers *WASTED*

Mining YES
 
Investment in better Drivers *WASTED*
Investment in Mantle and Vulkan and DX12 *WASTED*
Investment in better Collaboration with Game- and Engine-Makers *WASTED*

Mining YES

This is the major problem with selling all your cards to miners. Yes, you make the sales, but you're basically throwing away years of relations with gamers, developers and game engine makers. And a graphics card maker without an ecosystem of support and relations with developers is worth pretty much nothing. If they screw this up by neglecting gamers, they might as well close the consumer division and focus on mining nonsense. And when that happens, the PC gaming market will be screwed; having only NVIDIA as a graphics provider will suck enormously.

Or, if they plan on using mining as a sustainable income, they really need to do something about the availability of chips. I really hope Navi will pay off and they'll be able to do that. If not, it's gonna suck even more. Cards that are hard to obtain and sell at inflated prices are not something anyone ever wanted.
 
Am I the only one who questions the claimed 130 W draw? Seems a bit off. I would like to see testing with proper meters to see what the card is really using.

Software isn't always right about power draw.

I completely believe it, since you can get a Fury X to mine at 85 W.

Don't believe it? Undervolt these cards and you'll be shocked.
 
Oh, this is going to go down well... not! The miners will keep snatching them up, and all the bad PR goes to AMD :rolleyes:
 
No surprises here - as I stated long before release, when the first "leaks" appeared and when AMD suddenly rebranded their RX "4" series to RX "5" (why would AMD need to rebrand their old stuff a few months before THE Vega?) - Vega would be inferior to the GTX 1080 (a 1.5-year-old card) in all departments: performance, price/performance, and power draw. AMD knew that and could not do a thing about it; the only hope for AMD was miners. And as I stated before release: if all the RX 470 - RX 580 cards are sold out to miners for $400+, then Vega will be sold out for $800+. Obviously it is a pity for the gaming community - all the Vulkan vs. DX12, FreeSync vs. G-Sync and 4K-gaming competition hopes are put on ice for a long time. It is also bad for AMD financially, because there will soon be a flood of cheap second-hand cards (mining WILL obviously end) and no one will buy new AMD cards even at discounted prices - I can see rebranded Vega attempts after that :(
 
@RejZoR
you hit the nail on the head

GCN is stuck with so many features: "compute OK, graphics processing NOT SO OK".
I've lost confidence in any future change delivering the needed efficiency and performance.
The 1080 Ti is 35% faster at 231 W average than Vega 64 in Power Save mode at 214 W - what in the world would have to happen to get there with GCN?
GCN tops out at 4 geometry engines, which is forever 33.33% fewer than the 6 Nvidia can build.
The only way would be to clock the front end and back end 50% higher than Nvidia, which is very unlikely.
Primitive Shaders can help, but "never bet on a software solution where you can use a hardware solution!"
With a declining market share in gaming cards, no studio will use Primitive Shaders just for AMD.
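For reference, the front-end argument above boils down to a simple ratio. The sketch below assumes geometry throughput scales linearly with engine count and clock, which is a simplification of the commenter's reasoning rather than a statement about the actual hardware:

```python
# Illustrative only: if front-end throughput scaled simply as
# (geometry engines x clock), a 4-engine GCN front end would need a 50% clock
# advantage just to match a 6-engine Nvidia front end at equal per-engine rates.

gcn_engines = 4
nvidia_engines = 6

required_clock_ratio = nvidia_engines / gcn_engines  # 1.5
print(f"GCN would need ~{required_clock_ratio:.1f}x the clock "
      f"({required_clock_ratio - 1:.0%} higher) to match front-end throughput.")
```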
 
AMD seems to have given up on the high-end gaming market. RX Vega is definitely a compute card meant for things like servers, not gaming - it just has less memory than the "real" compute cards.

Let's hope Raja gets fired and the graphics team gets an increased budget so they can make compute cards AND gaming cards!
 
Investment in better Drivers *WASTED*
Investment in Mantle and Vulkan and DX12 *WASTED*
Investment in better Collaboration with Game- and Engine-Makers *WASTED*

Mining YES

THIS!
 
Nvidia is holding back Volta because it will have tons of compute with its 7 TPCs per GPC.
The successor of the GTX 1080 could have 3584 cores with 4 GPCs.
I would bet on 5 TPCs per GPC again, though, but with new SMs (without Tensor Cores).
They have plenty of months for tuning now, because Vega failed so hard.
I bet GCN Navi won't reach the 1080 Ti either, even if Navi clocks at 2.5 GHz.
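For reference, the core-count speculation above is simple multiplication. The sketch below assumes 128 shaders per TPC (the Pascal/Volta arrangement); the GPC/TPC counts are the commenter's guesses, not confirmed specs:

```python
# The core-count guess above, spelled out. Assumes 128 shaders per TPC
# (as in Pascal/Volta); the GPC/TPC counts are the commenter's speculation.

def cuda_cores(gpcs: int, tpcs_per_gpc: int, cores_per_tpc: int = 128) -> int:
    return gpcs * tpcs_per_gpc * cores_per_tpc

print(cuda_cores(4, 5))  # 2560 -> GTX 1080 (GP104) layout
print(cuda_cores(4, 7))  # 3584 -> the speculated 7-TPC-per-GPC successor
```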
 
Bye bye AMD Radeon Group... welcome AMD Mining Group :D. I expect the next series of Nvidia cards to start at $599 :D
 
A GPU should be more like a general-purpose chip for parallel compute tasks that can be used in any way the user needs. What Nvidia is doing with their stripped-down GPUs is really bad.

I really hope AMD will never, ever follow that path. Nvidia went so far with crippling GPUs that they swap shaders at the driver level and custom-patch drivers for specific games (isn't optimizing their applications supposed to be the developers' problem...).

I would love to see GPUs treated more like general-purpose processors, not some specific game renderer.

EDIT:

I mean, if I buy a GPU, I would like it to do the tasks that I throw at it. It is not anyone's business to interfere with what people do with their hardware (edit, render, mine, game, or even experiment with other stuff).
 
Lol. Some random guy from Reddit claims 130 W usage based on an HWInfo screenshot, and behold - it's all over the net!
I've been reading about it since yesterday, and even WCCFTech [!!!]... just think about it, WCCFTech did a follow-up/fact check on those claims:
WCCFTech.com said:
[Update 6:35 PM Sunday, September 3, 2017 ]: We finally got to do some power measurements of our own, using a power meter. Our test bed idles at 138 watts and the entire system consumes an average of about ~385 watts under load while mining. This yields a delta of 248 watts, which is obviously significantly higher than what the redditor claims to have achieved with his system.
For context: they are measuring an RX Vega 64 with a 980 mV undervolt at 1130/1100 MHz (vs. 1000/1100 at 1000 mV?).

All things considered, 43.5 MH/s is still an impressive result, but in this context it is irrelevant. That power consumption number is pure fiction from a delusional kid on Reddit, and until Vega finally hits the shelves at the promised prices, no one in their right mind is going to buy it. It would still take 6+ months for a complete return on investment at MSRP, and "f^&k that" at today's fictional retail prices. In terms of perf/W, a pair of undervolted GTX 1060 6GB cards makes more sense and is abundant in stores worldwide.

So, once again, there is no reason for miners to hunt for Vega until the price drops to MSRP, the shelves are stocked, and/or AMD optimizes the crap out of it to run at ~60+ MH/s.
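For what it's worth, the "6+ months for a complete return on investment" point comes down to a basic payback-period calculation. The sketch below is purely hypothetical: every input is a placeholder to be filled in from a current mining calculator and your own power tariff, not a real figure from this thread:

```python
# Hypothetical payback-period helper for the ROI point above. All inputs are
# placeholders, not real figures - plug in current numbers from a mining
# calculator and your own electricity tariff.

def payback_days(card_price_usd: float,
                 hashrate_mhs: float,
                 revenue_per_mhs_per_day_usd: float,
                 power_w: float,
                 electricity_usd_per_kwh: float) -> float:
    daily_revenue = hashrate_mhs * revenue_per_mhs_per_day_usd
    daily_power_cost = (power_w / 1000.0) * 24.0 * electricity_usd_per_kwh
    return card_price_usd / (daily_revenue - daily_power_cost)

# Illustrative call only - every value here is a placeholder:
# print(payback_days(card_price_usd=499, hashrate_mhs=43.5,
#                    revenue_per_mhs_per_day_usd=0.06,
#                    power_w=248, electricity_usd_per_kwh=0.12))
```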
 
@Raevenlord, please allow me to fix this for you:

"RX Vega Isn't for Gamers"

could be more like:

"RX Vega Isn't for Gaming ONLY!"

or maybe:

"RX Vega Isn't JUST an Nvidia-titles renderer ONLY!"

oh... wait! maybe:


"RX Vega Isn't JUST an Nvidia GameWorks renderer ONLY!"
ahh... nvm, my English isn't that good :lovetpu:
 
I would prefer to see a bit more restraint from TPU news; when we start linking random Reddit claims, this becomes a shitshow faster than you can imagine. Verify, let it bake for half a day, then reconsider whether it's still news, please.

If we want to follow every Reddit claim, we can just log on to Reddit, amirite?
 
So why did AMD feature gaming performance in their marketing slides then?
 
A GPU should be more like a general-purpose chip for parallel compute tasks that can be used in any way the user needs. What Nvidia is doing with their stripped-down GPUs is really bad.

I really hope AMD will never, ever follow that path. Nvidia went so far with crippling GPUs that they swap shaders at the driver level and custom-patch drivers for specific games (isn't optimizing their applications supposed to be the developers' problem...).

I would love to see GPUs treated more like general-purpose processors, not some specific game renderer.

EDIT:

I mean, if I buy a GPU, I would like it to do the tasks that I throw at it. It is not anyone's business to interfere with what people do with their hardware (edit, render, mine, game, or even experiment with other stuff).

You know what's so funny about this? When Nvidia started introducing more compute-oriented stuff with Fermi, people said Nvidia was selling useless features to those buying the GPU for gaming. Now that AMD has done the same and Nvidia makes two separate architectures for gaming and compute, people say what Nvidia is doing is bad (crippling the GPU's capabilities). So in the end, everything Nvidia does is wrong.
 
You know what's so funny about this? When Nvidia started introducing more compute-oriented stuff with Fermi, people said Nvidia was selling useless features to those buying the GPU for gaming. Now that AMD has done the same and Nvidia makes two separate architectures for gaming and compute, people say what Nvidia is doing is bad (crippling the GPU's capabilities). So in the end, everything Nvidia does is wrong.

Nah, people are just cheap and want everything (super-efficient gaming performance + all the compute they can get + solid drivers for everything) at the price of a mid-range gaming GPU, even though the gaming/pro market split has existed forever.

Meanwhile, Vega Frontier Edition is not in their systems either, and they waited out the first Titan to jump on the 780. You get it? :)

These are also the people who complain about Intel not making faster CPUs while there is no competitor in the world that can make faster ones.
 
Good news..

Perhaps used RX 580 and RX 480 cards will flood the market soon..
 
@Raevenlord, please allow me to fix this for you:

"RX Vega Isn't for Gamers"

could be more like:

"RX Vega Isn't for Gaming ONLY!"

or maybe:

"RX Vega Isn't JUST an Nvidia-titles renderer ONLY!"

oh... wait! maybe:


"RX Vega Isn't JUST an Nvidia GameWorks renderer ONLY!"
ahh... nvm, my English isn't that good :lovetpu:

No, Vega definitely isn't for gamers. AMD used GeForce cards in their Gamescom booth.


 