
RX Vega Achieves 43 MH/s @ 130 W in Ethereum Mining

Joined
Mar 24, 2012
Messages
528 (0.12/day)
GP100 is slower than Vega 64 in compute. The only way GP100 comes out on top is in memory-intensive tasks, thanks to its 4096-bit bus.

GP100 is only slightly slower than Vega in FP32 (roughly 10 TFLOPS vs 13 TFLOPS), but as a pure compute chip GP100 is still better than Vega, since GP100's FP64 rate is 1/2 of its FP32 while Vega's is 1/16 (about 5 TFLOPS vs 0.8 TFLOPS). And we still have to consider how efficiently the hardware can extract its raw performance.
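
To make the ratio arithmetic concrete, here's a minimal sketch in Python; the helper name is just for illustration, and the TFLOPS figures are the rough rated numbers quoted above, not measurements:

```python
# Rough FP64 throughput from rated FP32 and the architecture's FP64:FP32 ratio.
# Figures are the approximate rated numbers quoted above, not benchmarks.
def fp64_tflops(fp32_tflops: float, fp64_ratio: float) -> float:
    return fp32_tflops * fp64_ratio

gp100 = fp64_tflops(10.0, 1 / 2)    # ~5.0 TFLOPS FP64
vega64 = fp64_tflops(13.0, 1 / 16)  # ~0.8 TFLOPS FP64

print(f"GP100  FP64: ~{gp100:.1f} TFLOPS")
print(f"Vega64 FP64: ~{vega64:.1f} TFLOPS")
```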
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?

If you were able to purchase Vega 56 at MSRP, great. That's a pretty good gaming card, you'll be extremely satisfied - and you have objective reasons for being so.

If you purchased any Vega graphics card above MSRP, you may still feel great about your purchase, but objectively, it's almost certain that the comparable NVIDIA alternative is better in pure gaming/price/performance/power terms.

Meanwhile, if you're mining, you're actually tapping into Vega's potential and strengths, which, sadly (and I would love to be wrong), aren't reflected in its gaming prowess.

That's why these aren't for gamers right now. Your mileage may vary with personal opinion and your favorite manufacturer, sure. But objectively, in a technical review's price/performance/power-consumption comparison like you see here at TPU, that doesn't hold up.

Still, I will side with most people here who feel titles like these are on the edge of clickbait material, or just a bit over it. I don't mind a bit of sarcasm and fun in the news posts (I really like it, actually), but it's a VERY fine line, and this one crossed it. I still believe people are entitled to form their own opinion without being pushed towards a specific one from the outset.

A good contrast: in earlier articles you used the same tone of voice but ended with a genuine question inviting the opposite view. That works much better, because it opens up the debate instead of steering it.
 

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.35/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
But that also doesn't make them NOT for gaming; it just makes them inflated in price.

A valid argument can be made that it isn't for mining either, as the ROI at its current pricing doesn't balance out against some other cards.

See how that works... for both sides?


It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings exactly the same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than for a miner, who can recoup that money in ways other than gaming (which the gamer almost certainly never will).

Yes, they have lower ROI than some other cards, but if it's still profitable, miners will do it, especially with these undervolts and overclocks that bring Vega's compute power to the table, and even more so now. You also have to take pure performance density into account: a single Vega is (arguably) more interesting than a pair of RX 580s, simply because you can get a single Vega for the price of two RX 580s, thus achieving a smaller system footprint with the same, or almost equivalent, mining power.

That's why the argument works better for one side than the other, and that's why I insist it's a much better fit for miners than gamers.
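
To put the density argument in rough numbers, a sketch follows; every figure in it (hash rates, power draws, prices, slots per rig) is an illustrative placeholder rather than a measured or quoted value, so treat it as a template to plug your own numbers into:

```python
# Rough density comparison: one tuned Vega 64 vs. a pair of RX 580s.
# All numbers are illustrative placeholders, not measurements or quotes.
SLOTS_PER_RIG = 6  # assumed number of usable PCIe slots in a typical mining rig

cards = {
    "Vega 64 (tuned)": {"mh_s": 43.0, "watts": 200.0, "price": 800.0, "slots": 1},
    "2x RX 580":       {"mh_s": 2 * 26.0, "watts": 2 * 120.0, "price": 2 * 400.0, "slots": 2},
}

for name, c in cards.items():
    per_slot = c["mh_s"] / c["slots"]
    print(f"{name}: {c['mh_s']:.0f} MH/s, "
          f"{c['mh_s'] / c['watts']:.2f} MH/s per W, "
          f"{per_slot:.0f} MH/s per slot, "
          f"~{per_slot * SLOTS_PER_RIG:.0f} MH/s per {SLOTS_PER_RIG}-slot rig, "
          f"${c['price'] / c['mh_s']:.0f} per MH/s")
```

The per-slot and per-rig columns are the footprint point: a miner's constraint is often slots and risers, not just dollars per MH/s.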

I don't mind a bit of sarcasm and fun in the news posts (I really like it, actually), but it's a VERY fine line, and this one crossed it. I still believe people are entitled to form their own opinion without being pushed towards a specific one from the outset.

A good contrast: in earlier articles you used the same tone of voice but ended with a genuine question inviting the opposite view. That works much better, because it opens up the debate instead of steering it.

That's an excellent point, and one I can side with. While I don't feel it's clickbait, I admit that the title pushes my own view on the matter rather forcefully, and immediately. And while my original reading of the title didn't see it that way, I understand perfectly why readers might.

As such, I will remove that excessive fat from the title, and leave this here for users to see how the change occurred.
 
Joined
Mar 24, 2012
Messages
528 (0.12/day)
People whinge about AMD abandoning gamers.

Gamers abandoned AMD a long time ago, even when its cards were faster and cheaper.

AMD's strategy will make sense in the long term - full-precision shaders are unnecessary for 90% of game tasks, so the shift to 16-bit shaders will be a boon both for compute-centric tasks and for gaming that uses them.

Until then (if you're one of the 30% that buys a card faster than a 580), either you have other shit to do than just game, or you really like AMD tech no matter what.

Is that so? If that were the case, AMD would not have gained market share when they had a six-month lead with the 5870 (they even beat Nvidia in market share back then). They also gained market share when they had a three-month lead with the 7970. If AMD wants people to buy their GPUs, they need the fastest-single-GPU crown for themselves. It's that simple.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings exactly the same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than for a miner, who can recoup that money in ways other than gaming (which the gamer almost certainly never will).

Yes, they have lower ROI than some other cards, but if it's still profitable, miners will do it, especially with these undervolts and overclocks that bring Vega's compute power to the table, and even more so now. You also have to take pure performance density into account: a single Vega is (arguably) more interesting than a pair of RX 580s, simply because you can get a single Vega for the price of two RX 580s, thus achieving a smaller system footprint with the same, or almost equivalent, mining power.

That's why the argument works better for one side than the other, and that's why I insist it's a much better fit for miners than gamers.

Better doesn't mean good, nor does it mean it isn't a gaming card.
 
Joined
Aug 6, 2017
Messages
7,412 (3.05/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
At least Vega is gonna have better resale value than GTX. Unless the cryptocurrency mining market plummets. Then Vega's resale value is going to go down dramatically.
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?

If you were able to purchase Vega 56 at MSRP, great. That's a pretty good gaming card, you'll be extremely satisfied - and you have objective reasons for being so.

If you purchased any Vega graphics card above MSRP, you may still feel great about your purchase, but objectively, it's almost certain that the comparable NVIDIA alternative is better in pure gaming/price/performance/power terms.

Meanwhile, if you're mining, you're actually tapping into Vega's potential and strengths, which, sadly (and I would love to be wrong), aren't reflected in its gaming prowess.

That's why these aren't for gamers right now. Your mileage may vary with personal opinion and your favorite manufacturer, sure. But objectively, in a technical review's price/performance/power-consumption comparison like you see here at TPU, that doesn't hold up.

I guess it really comes down to who you class in the "for gamers" bracket. Most gamers are not spending even Vega 56 (MSRP) money, as the majority buy more entry-level cards. Last time I looked, the 1050 was the gamers' choice, vastly outselling all other GPUs. It's us enthusiasts who pay for the cards higher up the tree, and we are currently unable to get these cards for whatever purpose we want to use them for.


While I am not disputing the facts in the piece, I just feel the title is a large part of the reason people are already arguing in this thread: disputes which spill out across the whole forum and do not make it the friendly place it once was :(
 

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.35/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
While I am not disputing the facts in the piece, I just feel the title is a large part of the reason people are already arguing in this thread: disputes which spill out across the whole forum and do not make it the friendly place it once was :(

I agree with that, hence my removal of the offending words from the title =)

I've made my arguments here in the comment section, and I still believe that currently Vega isn't a good option for gamers/enthusiast gamers. However, the way it was conveyed wasn't the right one, nor the one I want to have here on our site, so I prefer to take a step back instead of plowing through users' expectations.
 

silentbogo

Moderator
Staff member
Joined
Nov 20, 2013
Messages
5,470 (1.45/day)
Location
Kyiv, Ukraine
System Name WS#1337
Processor Ryzen 7 3800X
Motherboard ASUS X570-PLUS TUF Gaming
Cooling Xigmatek Scylla 240mm AIO
Memory 4x8GB Samsung DDR4 ECC UDIMM
Video Card(s) Inno3D RTX 3070 Ti iChill
Storage ADATA Legend 2TB + ADATA SX8200 Pro 1TB
Display(s) Samsung U24E590D (4K/UHD)
Case ghetto CM Cosmos RC-1000
Audio Device(s) ALC1220
Power Supply SeaSonic SSR-550FX (80+ GOLD)
Mouse Logitech G603
Keyboard Modecom Volcano Blade (Kailh choc LP)
VR HMD Google dreamview headset(aka fancy cardboard)
Software Windows 11, Ubuntu 20.04 LTS
It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings exactly the same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than for a miner, who can recoup that money in ways other than gaming (which the gamer almost certainly never will).
The price tag impacts everyone, but I don't believe it makes this in any way a non-gaming product. I can't remember a time when "overpriced" was a barrier for PC enthusiasts. People overpay by crazy amounts for everything, from constantly degenerating gaming mice to overengineered VRMs and ridiculously massive cooling solutions, from LEDs and "Limited Edition" color options to factory overclocks. Even for brand names...

In the case of miners, it's all about the price. Even if Vega is the best possible mining card on the market, it still does not make sense to opt for a 7-8 month ROI when less powerful options can turn a profit in 4-5 months, even at current inflated prices.
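
For anyone who wants to sanity-check that kind of ROI claim themselves, here's a minimal sketch; the function name and every input (card price, hash rate, power draw, revenue per MH, electricity rate) are placeholders to be replaced with current numbers:

```python
# Months to recoup a card's purchase price from mining revenue minus electricity.
# All inputs are placeholders; plug in current prices, payouts, and power rates.
def roi_months(card_price: float, mh_s: float, watts: float,
               usd_per_mh_per_day: float, usd_per_kwh: float) -> float:
    revenue_per_day = mh_s * usd_per_mh_per_day
    power_cost_per_day = watts / 1000 * 24 * usd_per_kwh
    daily_profit = revenue_per_day - power_cost_per_day
    if daily_profit <= 0:
        return float("inf")  # never pays itself back
    return card_price / daily_profit / 30

print(f"{roi_months(800, 43, 200, 0.10, 0.12):.1f} months")  # example inputs only
```

With those example inputs it lands around the seven-month mark, which is the ballpark being argued about; a cheaper card or a different electricity rate moves it a lot.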
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
You guys need an editor who checks your articles before they're published. Too many times things have needed to be changed and retracted. I love the informed takes, a lot better than it was with other newsies in the past, but still... that title was.....
 
Joined
Aug 6, 2009
Messages
1,162 (0.22/day)
Location
Chicago, Illinois
Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?

If you were able to purchase Vega 56 at MSRP, great. That's a pretty good gaming card, you'll be extremely satisfied - and you have objective reasons for being so.

If you purchased any Vega graphics card above MSRP, you may still feel great about your purchase, but objectively, it's almost certain that the comparable NVIDIA alternative is better in pure gaming/price/performance/power terms.

Meanwhile, if you're mining, you're actually tapping into Vega's potential and strengths, which, sadly (and I would love to be wrong), aren't reflected in its gaming prowess.

That's why these aren't for gamers right now. Your mileage may vary with personal opinion and your favorite manufacturer, sure. But objectively, in a technical review's price/performance/power-consumption comparison like you see here at TPU, that doesn't hold up.


I'm in total agreement. However, these days you need to be a bit more specific to prevent any counterargument, since people nitpick every little thing. Vega is not for gamers in the sense that GAMERS can CURRENTLY get a lot more performance for the money out of just about any Nvidia card. Should Vega pricing get back to MSRP, it's a lot more appealing: like you said, the 56 is actually decent, while the 64 still isn't good in my opinion unless the price comes down, power consumption (and the heat it produces) being my main reason.
 

idx

Joined
Apr 2, 2009
Messages
98 (0.02/day)
https://www.nvidia.com/content/PDF/...DIA_Fermi_Compute_Architecture_Whitepaper.pdf

Efforts to exploit the GPU for non-graphical applications have been underway since 2003. By using high-level shading languages such as DirectX, OpenGL and Cg, various data parallel algorithms have been ported to the GPU. Problems such as protein folding, stock options pricing, SQL queries, and MRI reconstruction achieved remarkable performance speedups on the GPU. These early efforts that used graphics APIs for general purpose computing were known as GPGPU programs.

While the GPGPU model demonstrated great speedups, it faced several drawbacks. First, it required the programmer to possess intimate knowledge of graphics APIs and GPU architecture. Second, problems had to be expressed in terms of vertex coordinates, textures and shader programs, greatly increasing program complexity. Third, basic programming features such as random reads and writes to memory were not supported, greatly restricting the programming model. Lastly, the lack of double precision support (until recently) meant some scientific applications could not be run on the GPU.

To address these problems, NVIDIA introduced two key technologies: the G80 unified graphics and compute architecture (first introduced in GeForce 8800®, Quadro FX 5600®, and Tesla C870® GPUs), and CUDA, a software and hardware architecture that enabled the GPU to be programmed with a variety of high level programming languages. Together, these two technologies represented a new way of using the GPU. Instead of programming dedicated graphics units with graphics APIs, the programmer could now write C programs with CUDA extensions and target a general purpose, massively parallel processor. We called this new way of GPU programming "GPU Computing"; it signified broader application support, wider programming language support, and a clear separation from the early "GPGPU" model of programming.

So GPGPU is dead, long live GPU-Computing ... (addition for the archives)

OpenCL is a GPGPU API, and since OpenGL 4.3 there is a shader stage called the Compute Shader, so any developer can throw some general-purpose tasks at that stage of rendering.
Vulkan is also going to merge with OpenCL for GPGPU applications (sadly, all Nvidia GPUs support only OpenCL 1.2, while Intel and AMD already support 2.1, with 2.2 on the way).

Khronos stated during SIGGRAPH 2017 that Vulkan is going to be not just a graphics API; Vulkan is a GPU API.
 

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.68/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
No real issue with this from a consumer perspective. It would be great if AMD would market their products better so that people can be better informed.

tl;dr
AMD for mining and compute
NVIDIA for gaming
I would just be happy if AMD didn't lie about its pricing. I used to be an AMD/ATI fan. I don't take kindly to liars. How can we be better informed when even the price they tell reviewers is a blatant lie?
 
Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings exactly the same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than for a miner, who can recoup that money in ways other than gaming (which the gamer almost certainly never will).

Yes, they have lower ROI than some other cards, but if it's still profitable, miners will do it, especially with these undervolts and overclocks that bring Vega's compute power to the table, and even more so now. You also have to take pure performance density into account: a single Vega is (arguably) more interesting than a pair of RX 580s, simply because you can get a single Vega for the price of two RX 580s, thus achieving a smaller system footprint with the same, or almost equivalent, mining power.

That's why the argument works better for one side than the other, and that's why I insist it's a much better fit for miners than gamers.



That's an excellent point, and one I can side with. While I don't feel it's clickbait, I admit that the title pushes my own view on the matter rather forcefully, and immediately. And while my original reading of the title didn't see it that way, I understand perfectly why readers might.

As such, I will remove that excessive fat from the title, and leave this here for users to see how the change occurred.
I have a water-blocked, modded RX 480 (580-modded timings + BIOS) and a water-blocked RX Vega 64 in the same rig, with more than enough cooling for them, and I cannot hit those values at that wattage. The memory downclocks if underpowered, so you have to push the power slider to +20 to maintain that memory speed even though it's undervolted and downclocked. Cards differ, but I'd suggest he's reading the wrong value, as does WCCF in a rare change from their normal hyperbole.
WCCF ran their own tests that suggest the Reddit user is wrong, as is the software he's using, HWiNFO64.
I tried the latest HWiNFO64 build, and the reported HBM memory power draw dropped from 100-165 watts to never above 30 watts between builds.


And therein lies the pertinent point: 90% of tuning and monitoring software does not work at all with Vega, so anyone saying anything definitive about what Vega can or can't do is probably misinformed, including many reviewers and their overclocks. A power clamp is the only sure way I've seen of knowing Vega's power draw.

That's a full-stop comment: every piece of software lies.
All of them are unreliable, yet people are basing a lot of claims and comments on them, or worse, on word of mouth.
 
Joined
Feb 2, 2015
Messages
2,707 (0.81/day)
Location
On The Highway To Hell \m/
Seriously!? This is 1000000000000000000000000000000000000000000000000% unsubstantiated BULLSHIT! It CANNOT be and HAS NOT been replicated or confirmed by anyone and until it is it SHOULD NOT be believed for half a second.

And YES, someone already tried to replicate and confirm the results. And came NOWHERE CLOSE!!!
http://wccftech.com/amd-rx-vega-64-...m-eclipsing-polaris-efficiency-factor-2x/amp/

Proving it's total BS!!!

Let me quote myself for clarity on the matter:
Read the article carefully. They basically proved it was impossible at the wattage reportedly consumed.

1 card @ 248 W = 43.8 MH/s
2 cards @ 248 W = 87 MH/s

Do the math. It doesn't add up. First, half of 248 is 124 (not ~130). Second, how can two cards achieve almost exactly double the hash rate of one card while consuming exactly the same total wattage as one card? Answer... they can't. It's BS.
https://www.techpowerup.com/forums/threads/so-long-vega.236740/
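
For readers following along, the arithmetic in that quote can be spelled out in a few lines, using only the figures quoted above:

```python
# Sanity-check the quoted figures: 1 card at 248 W -> 43.8 MH/s, 2 cards at 248 W -> 87 MH/s.
one_card_mh, one_card_w = 43.8, 248.0
two_card_mh, two_card_w = 87.0, 248.0

print(f"Implied per-card power (2-card run): {two_card_w / 2:.0f} W")    # 124 W, not ~130 W
print(f"Efficiency, 1 card:  {one_card_mh / one_card_w:.3f} MH/s per W")
print(f"Efficiency, 2 cards: {two_card_mh / two_card_w:.3f} MH/s per W")
# Doubling the hash rate at the same total board power would mean efficiency
# doubled simply by adding a second card, which is why the numbers don't add up.
```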
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,866 (3.00/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Instead of taking your ball and going home with a snarky comment, how about you respond with why you feel that isn't true... it's how forums should work.
I was just responding in kind; did you properly read what I said? If people think I "don't have an answer" (I do), then I don't really care either. Make a polite post to me and I'll happily discuss it. Snarky, and I won't bother. Simple.

I see that a friendly and reasoned reply, from Vayra I think, has disappeared, most likely because it was posted during the database transition. I never had a chance to read it properly, unfortunately, but I would have been happy to reply to it.
 
Joined
Feb 18, 2017
Messages
688 (0.27/day)
Nvidia is holding back Volta because it will have tons of compute with its 7 TPCs per GPC.
The successor of the GTX 1080 could have 3584 cores with 4 GPCs.
I would bet on 5 TPCs per GPC again, but with a new SM (without Tensor Cores).
They have plenty of months for tuning now, because Vega failed so hard.
I bet GCN Navi doesn't reach the 1080 Ti either, even if Navi clocks at 2.5 GHz.
What?
Vega reached the 980 Ti (1070) and even the 1080. Why are you drawing such unfounded conclusions?
 
Joined
May 12, 2015
Messages
78 (0.02/day)
Location
Michigan
Processor AMD 5900X
Motherboard AsRock B550 Phantom Gaming-ITX/ax
Cooling Swiftech Apogee Drive 2
Memory GSkill Trident Z F4-4266C17D-32GTZRB
Video Card(s) Nvidia 3800FE (Byski WB)
Storage WD Black SN750 NVME
Display(s) LG 34UC88
Case NCASE M1 (1st edition)
Power Supply Corsair SM600
Mouse Mionix NAOS 7000
Keyboard Corsair K70 Lux
Well, I have a Vega 64 I just installed (I've had it for a couple of weeks, but I was waiting on my EK block). My FreeSync monitor kept me with AMD.

I tossed it in my NCASE build and the blower can barely keep up in the ITX case... FYI, in case others were curious. Anyway, my rig idles at around 60-70 watts. Using the same settings this person did, I'm pulling about 255 watts and the same 42-43 MH/s. Doing the math, that puts the card at around 180-190 watts. Definitely not 130 watts.
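
The subtraction being described looks roughly like this (a quick sketch using only the wall-power figures reported in this post; it doesn't separate out PSU losses or any extra CPU load while mining):

```python
# Estimate the card's share of wall power: total draw while mining minus the rig's idle draw.
# Figures are the ones reported in this post; this is a rough at-the-wall estimate only.
idle_w = (60 + 70) / 2          # reported idle range, averaged
mining_w = 255                  # reported wall draw while mining
hash_rate = 42.5                # reported MH/s, middle of the 42-43 range

card_w = mining_w - idle_w
print(f"Estimated card power: ~{card_w:.0f} W")             # ~190 W, nowhere near 130 W
print(f"Efficiency: ~{hash_rate / card_w:.2f} MH/s per W")
```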

Attaching a super quick screenshot
 

Attachment: Untitled.jpg (screenshot, 311.6 KB)
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I was just responding in kind; did you properly read what I said? If people think I "don't have an answer" (I do), then I don't really care either. Make a polite post to me and I'll happily discuss it. Snarky, and I won't bother. Simple.

I see that a friendly and reasoned reply, from Vayra I think, has disappeared, most likely because it was posted during the database transition. I never had a chance to read it properly, unfortunately, but I would have been happy to reply to it.
lol....

Dear qubit, can you please kindly respond to the person with whatever information you have so as to clear up the misconceptions...?

Kthx. :)
 

Chloefile

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
10,877 (2.64/day)
Location
Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte B550M Aorus Elite
Cooling Custom loop (CPU+GPU, 240 & 120 rads)
Memory 32GB Kingston HyperX Fury @ DDR4-3466
Video Card(s) PowerColor RX 6700 XT Fighter
Storage ~4TB SSD + 6TB HDD
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Fractal Design Define Mini C
Audio Device(s) Asus TUF H3 Wireless
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 Legendary
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis remastered at 4K
This toy-money craziness needs to stop. I want a card at an OK price for gaming, not a mid-range card at an insane price because toy-money miners buy up all the cards.

Luckily I have lots of Monopoly money.
 
Joined
Sep 26, 2012
Messages
856 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
Is that so? If that were the case, AMD would not have gained market share when they had a six-month lead with the 5870 (they even beat Nvidia in market share back then). They also gained market share when they had a three-month lead with the 7970. If AMD wants people to buy their GPUs, they need the fastest-single-GPU crown for themselves. It's that simple.

Nvidia still outsold AMD that half. It's the closest AMD ever came to taking the lead in market share, but even against the 200 series, AMD couldn't outsell Nvidia. Which is sad, because the 5xxx series was awesome.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,866 (3.00/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
lol....

Dear qubit, can you please kindly respond to the person with whatever information you have so as to clear up the misconceptions...?

Kthx. :)
That's some glorious sarcasm. :p

You didn't get my point though. ;)
 
Joined
Jul 13, 2016
Messages
2,792 (0.99/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Lol. Some random guy on Reddit claims 130 W usage based on an HWiNFO screenshot and behold, it's all over the net!
I've been reading about it since yesterday, and even WCCFTech [!!!]... just think about it, WCCFTech did a follow-up/fact-check on those claims:

In context: they are measuring an RX Vega 64 with a 980 mV undervolt at 1130/1100 MHz (vs. 1000/1100 at 1000 mV?).

All things considered, 43.5 MH/s is still an impressive result, but in this context it is irrelevant. That power consumption number is total fiction from a delusional kid on Reddit, and until Vega finally hits the shelves at the promised prices, no one in their right mind is going to buy it. It is still 6+ months to a complete return on investment at MSRP, and "f^&k that" at today's fictional retail price. In terms of perf/W, a pair of undervolted GTX 1060 6GBs makes more sense and is abundant in stores worldwide.

So, once again, there is no reason for miners to hunt for Vega until the price drops to MSRP, the shelves are stocked, and/or AMD optimizes the crap out of it to run at ~60+ MH/s.

There must be a lot of delusional people then, because it's been sold out since launch. FYI, Vega may not be an awesome gaming card, but it is still very good at professional work and mining.

"At today's fictional retail price. In terms of perf/W, a pair of undervolted GTX 1060 6GBs makes more sense and is abundant in stores worldwide."

Read the article: density is important to miners. In addition, GTX 1060 6GB cards are running about $300 right now.
 