
AMD Radeon RX Vega Put Through 3DMark

Joined
Dec 12, 2016
Messages
1,232 (0.46/day)
Did you even read that review?



Stock boost was 2038

Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card not the overclocked results (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1 but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330 which is 4% lower. Yes you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia but at 35% higher power. That sucks in my book. Others may not care.
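For what it's worth, the scaling arithmetic above can be sketched out as a quick back-of-the-envelope. This assumes 3DMark score scales linearly with core clock, which is only a rough approximation (real scaling is usually sub-linear), and the "implied stock-clock score" line is a derived number, not one from the leak:

```python
# Back-of-the-envelope check of the clock/score arithmetic above.
# Assumption: 3DMark score scales linearly with core clock (only a rough
# approximation; real scaling is usually sub-linear).

def scale_score(score: float, clock_from: float, clock_to: float) -> float:
    """Linearly rescale a benchmark score from one core clock to another."""
    return score * clock_to / clock_from

ref_clock = 1924    # MHz, the 1080 boost clock quoted in the leak
avg_clock = 1982    # MHz, average clock of the Gigabyte Aorus card
vega_score = 22330  # leaked RX Vega 3DMark score at 1630 MHz

print(f"clock delta: {(avg_clock - ref_clock) / ref_clock:.1%}")  # ~3.0%

adjusted_1080 = 23260  # the clock-adjusted Gigabyte score used above
print(f"Vega trails by: {1 - vega_score / adjusted_1080:.1%}")    # ~4.0%

# Implied score back at the 1924 MHz reference clock under the same
# linear assumption (a derived figure, not a number from the thread):
print(f"implied stock-clock score: {scale_score(adjusted_1080, avg_clock, ref_clock):.0f}")

# Power side: 243 W peak (Gigabyte 1080) vs the rumored 375 W liquid Vega.
# 243 W is about 35% less than 375 W, which is where the 35% comes from.
print(f"1080 draws {1 - 243 / 375:.0%} less power")               # 35%
```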
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.27/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card not the overclocked results (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1 but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330 which is 4% lower. Yes you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia but at 35% higher power. That sucks in my book. Others may not care.

And if you look at the post I made earlier as far as more efficient nvidia cards go the 1080ti massively underclocked still beats it.
 
Joined
Feb 12, 2015
Messages
1,104 (0.33/day)
I don't think even a combination of Intel's and Nvidia's driver teams could fix a card that only competes with a downclocked 1080 Ti running at 55% of its TDP... that is a lot of ground to make up.

It's not a matter of having a better driver team, it's a matter of the card just being too complicated period.

But don't underestimate that this is the first arch to use Rapid Packed Math, High Bandwidth Cache, and (supposedly) tiled rasterization all at once. Then throw in the fact that AMD has been on GCN for 6 years straight.
 
Joined
Dec 12, 2016
Messages
1,232 (0.46/day)
And if you look at the post I made earlier as far as more efficient nvidia cards go the 1080ti massively underclocked still beats it.

Oh yes. I quite agree. The TI version puts AMD even more to shame. This is why I hope they price Vega around $500 like the GTX 1080.
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.27/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
It's not a matter of having a better driver team, it's a matter of the card just being too complicated period.

But don't underestimate that this is the first arch to use Rapid Packed Math, High Bandwidth Cache, and (supposedly) tiled rasterization all at once. Then throw in the fact that AMD has been on GCN for 6 years straight.

Two years since the first benchmarks started making their rounds. In that time AMD has done what? I strongly doubt their card is any more advanced than the HPC products on the market.
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Then throw in the fact that AMD has been on GCN for 6 years straight.

That's six years to optimize it incrementally, but AMD was happier adding shaders and going to 512 bit. They started considering incremental refinements with Tonga, which was a dud, and we all know the rebrand hell that followed. AMD could have seen this coming for at least 3-4 years; you can't argue that TBR shouldn't have been here long ago already.

Sorry, but "card too difficult" I just don't buy. GCN is a stubborn fucker that wanted to be jack of all trades, and AMD has been paying the price for that ever since DX11. They didn't choose HBM because they figured it'd make a nice, cost-effective GPU; they did it because they were desperate for board TDP budget and out of good ideas for GCN itself. Hawaii was the writing on the wall, really. The only thing that saved Polaris from being horrible was delta compression > smaller bus, and a smaller node.
 
Last edited:
Joined
Dec 21, 2005
Messages
480 (0.07/day)
Location
USA
System Name Eric's Battlestation
Processor Core i7 6700k
Motherboard GIGABYTE G1 Gaming GA-Z170X-Gaming 7
Cooling Fractal Design Celsius S24
Memory Patriot Viper Steel Series DDR4 32GB 3200MHz
Video Card(s) MSI Mech 6750 XT
Storage Samsung 850 EVO 1TB, Crucial MX500 1TB, Intel 660p 2TB
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply EVGA G2-XR 80 Plus Gold 750W
Mouse Steelseries Rival 3
Keyboard Logitech G810
Software Microsoft Windows 10 Home
The timing of this card really couldn't be much worse, but the fact is that if AMD can provide nearly-1080 performance for less than the MSRP, it can do well. Doubtless Nvidia won't sit still, so keep on waiting if that's your thing. I for one need a new GPU to replace the seriously long-in-the-tooth HD 6970 in my wife's PC. She'll get my 970 and I'll take Vega with the FreeSync monitor I purchased recently.
 
Joined
Feb 12, 2015
Messages
1,104 (0.33/day)
That's six years to optimize it incrementally, but AMD was happier adding shaders and going to 512 bit. They started considering incremental refinements with Tonga, which was a dud, and we all know the rebrand hell that followed. AMD could have seen this coming for at least 3-4 years; you can't argue that TBR shouldn't have been here long ago already.

Sorry, but "card too difficult" I just don't buy. GCN is a stubborn fucker that wanted to be jack of all trades, and AMD has been paying the price for that ever since DX11. They didn't choose HBM because they figured it'd make a nice, cost-effective GPU; they did it because they were desperate for board TDP budget and out of good ideas for GCN itself. Hawaii was the writing on the wall, really. The only thing that saved Polaris from being horrible was delta compression > smaller bus, and a smaller node.

What about Hawaii?

Hawaii wiped the floor with Kepler, and Grenada even managed to stay competitive with Maxwell.

The timing of this card really couldn't be much worse, but the fact is that if AMD can provide nearly-1080 performance for less than the MSRP, it can do well. Doubtless Nvidia won't sit still, so keep on waiting if that's your thing. I for one need a new GPU to replace the seriously long-in-the-tooth HD 6970 in my wife's PC. She'll get my 970 and I'll take Vega with the FreeSync monitor I purchased recently.


In my opinion AMD needs to just give up on making a profit on gaming Vega. At this point they need to keep marketshare, and Zen should make them enough money to be fine until Navi.

If Vega is a 300w 1080, it should cost $400 at most.
 
Last edited by a moderator:
Joined
Jul 16, 2014
Messages
8,116 (2.28/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Steeseries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
So when will we see a real review here?
 
Joined
Dec 21, 2005
Messages
480 (0.07/day)
Location
USA
System Name Eric's Battlestation
Processor Core i7 6700k
Motherboard GIGABYTE G1 Gaming GA-Z170X-Gaming 7
Cooling Fractal Design Celsius S24
Memory Patriot Viper Steel Series DDR4 32GB 3200MHz
Video Card(s) MSI Mech 6750 XT
Storage Samsung 850 EVO 1TB, Crucial MX500 1TB, Intel 660p 2TB
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply EVGA G2-XR 80 Plus Gold 750W
Mouse Steelseries Rival 3
Keyboard Logitech G810
Software Microsoft Windows 10 Home
In my opinion AMD needs to just give up on making a profit on gaming Vega. At this point they need to keep marketshare, and Zen should make them enough money to be fine until Navi.

If Vega is a 300w 1080, it should cost $400 at most.

Agreed. Pricing can make or break the product. I think AMD can appreciate gaining market share from its competitors (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Agreed. Pricing can make or break the product. I think AMD can appreciate gaining market share from its competitors (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.

The bulk of market share is within the RX480 segment, and Polaris was targeting that. Not Vega. The only way you gain market share with enthusiast/high end is by having the halo card actually be the fastest and allow for trickle down, which is everything Vega's not. Trickle down HBM won't ever happen in the foreseeable future.

@Captain_Tom about Hawaii, read my post. It was GCN with 512 bit and more shaders to counter a major architecture inefficiency issue, one that still exists today and is now hitting its absolute ceiling.
 
Joined
Jan 3, 2015
Messages
2,881 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 GB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores Se more about my 2 in 1 system here: kortlink.dk/2ca4x
I came, I saw, and I do not want it.

If this is how RX Vega performs, and if the rated TDP of 300 or 375 W is true, I do not want it.

Then I am just even happier I did not wait for Vega and got a 1080 Ti.

Besides a high power draw that will affect your electricity bill, there are other downsides to a high-TDP card:

It heats up the room it is in faster.
Potentially more noise from the fans.
It needs to be bigger to make space for a cooler that can handle a 300 W+ TDP, if that TDP turns out to be true, of course.

Can you really live with a card that uses close to double the power for the same performance another card can deliver at half the draw? I cannot.

Nope, RX Vega doesn't impress me much so far.
 
Joined
Jun 6, 2012
Messages
118 (0.03/day)
I know it's disappointing, but how did you deduce that it's slower than the GTX 1080? Most reference 1080s got around 20000, and since this one is a reference card it should be the other way around.
 
Joined
Dec 21, 2005
Messages
480 (0.07/day)
Location
USA
System Name Eric's Battlestation
Processor Core i7 6700k
Motherboard GIGABYTE G1 Gaming GA-Z170X-Gaming 7
Cooling Fractal Design Celsius S24
Memory Patriot Viper Steel Series DDR4 32GB 3200MHz
Video Card(s) MSI Mech 6750 XT
Storage Samsung 850 EVO 1TB, Crucial MX500 1TB, Intel 660p 2TB
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply EVGA G2-XR 80 Plus Gold 750W
Mouse Steelseries Rival 3
Keyboard Logitech G810
Software Microsoft Windows 10 Home
The bulk of market share is within the RX480 segment, and Polaris was targeting that. Not Vega. The only way you gain market share with enthusiast/high end is by having the halo card actually be the fastest and allow for trickle down, which is everything Vega's not. Trickle down HBM won't ever happen in the foreseeable future.

@Captain_Tom about Hawaii, read my post. It was GCN with 512 bit and more shaders to counter a major architecture inefficiency issue, one that still exists today and is now hitting its absolute ceiling.

There's more than one way to skin a cat, as they say. Anything at or above 1070 is enthusiast/high end and with what we know of Vega it will be at that level at least. At a competitive price point it can gain some share. I'm not saying it's going to flip the market share around in a cycle, but at this point AMD is looking for any gains it can muster right? What's important is that they don't screw this up. I know we have our team red and green, but we should all be rooting for both to do well to keep competition healthy and prices low for all.
 

Nkd

Joined
Sep 15, 2007
Messages
364 (0.06/day)
Well, if this is the XTX air/water... that isn't good. If it is something lower in the product stack...

300/375 W vs 180 W (GTX 1080) doesn't look good in performance/W. Looks like they will slide in on price/performance ratio and undercut.

Gaming Vega has a TDP of 275 W for the entire board before overclocking. VideoCardz had the numbers earlier this month, I think.
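Those TDP figures translate into a rough performance-per-watt gap. A quick sketch, taking the thread's own premise that both cards land at roughly the same 3DMark score (~22330, the leaked Vega number) — the equal-score assumption and board powers are from this thread, not measurements:

```python
# Rough performance-per-watt comparison using the board-power figures
# discussed in this thread. Assumes both cards score roughly the same
# (~22330, the leaked Vega number); that premise is the thread's, not
# a measurement.

score = 22330  # leaked Vega 3DMark score, applied to both cards here

boards = {
    "GTX 1080 (reference TDP)": 180,
    "RX Vega (gaming board)": 275,
    "RX Vega (liquid, rumored)": 375,
}

for name, watts in boards.items():
    print(f"{name}: {score / watts:.1f} points/W")

# At (assumed) equal performance, the power ratio is the whole story:
print(f"gaming Vega board power vs 1080 TDP: {275 / 180:.2f}x")
```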
 
Joined
Dec 15, 2006
Messages
1,703 (0.27/day)
Location
Oshkosh, WI
System Name ChoreBoy
Processor 8700k Delided
Motherboard Gigabyte Z390 Master
Cooling 420mm Custom Loop
Memory CMK16GX4M2B3000C15 2x8GB @ 3000Mhz
Video Card(s) EVGA 1080 SC
Storage 1TB SX8200, 250GB 850 EVO, 250GB Barracuda
Display(s) Pixio PX329 and Dell E228WFP
Case Fractal R6
Audio Device(s) On-Board
Power Supply 1000w Corsair
Software Win 10 Pro
Benchmark Scores A million on everything....
I am sure it will (And it does) dominate at certain professional workloads just like Bulldozer did. However gamers will say "Why didn't they just die shrink Phenom II/Polaris?!"
Bingo! We have a winner! I would have jizzed my pants over an 8-core Phenom @ 4 GHz...
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Gaming Vega has a TDP of 275 W for the entire board before overclocking. VideoCardz had the numbers earlier this month, I think.
There were two values: one for the air-cooled card and one for the water-cooled. The water-cooled version shows 375 W there, while air is 285 W. Most other sites report 300/375. A cheesy pump or two sure as hell isn't 90 W. ;)

https://videocardz.com/amd/radeon-500
 
Joined
Aug 2, 2011
Messages
1,451 (0.31/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Acer X34S, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502 Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card not the overclocked results (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1 but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330 which is 4% lower. Yes you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia but at 35% higher power. That sucks in my book. Others may not care.


The definition of an overclocked card has become somewhat of a misnomer since Pascal launched. Our understanding of overclocking changed drastically with the advent of GPU Boost 3.0 (which is on all Pascal-based GPUs). ALL 1080s, whether they're Founders Edition or not, will generally boost to at least ~1900 MHz; that is really what you should treat as the normalized frequency of these cards. It's rare you'll see one that won't boost to that level or higher unless it's starved for air.

What you should be looking for is STOCK for STOCK performance, no matter whether the card is "overclocking" itself or not. Compare a Founders Edition card to a stock RX Vega, which I'm sure is what all the news outlets will test once they are able to publish their reviews of RX Vega.
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,744 (1.71/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Enermax ETX-T50RGB
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard EVGA Z15
Software Windows 11 +startisallback
Wow, slower than a 1080 and nearly double the power consumption.
Sign me up.

An overclocked 1080 draws about 200 W.

Vega draws >350 W at the clocks needed to match a 1080.

If you run it at ~1300 to 1400 MHz it _only_ draws 290 W and gets its ass beat by the 1070.
Pretty simple math.
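The "clocks up, power way up" pattern in the numbers above follows from basic CMOS dynamic power, P ≈ C·V²·f: voltage has to rise along with frequency, and power scales with the square of voltage. A toy sketch — the frequency/voltage bumps below are illustrative, not measured Vega data:

```python
# Toy model of why GPU power climbs much faster than clock speed.
# CMOS dynamic power goes as P ~ C * V^2 * f, and V must rise with f.
# The frequency/voltage bumps here are illustrative, not measured data.

def dynamic_power(base_w: float, f_ratio: float, v_ratio: float) -> float:
    """Scale dynamic power by a frequency ratio and voltage ratio squared."""
    return base_w * f_ratio * v_ratio ** 2

base = 220.0  # W at some baseline clock (hypothetical figure)

# A 15% clock bump that needs an 8% voltage bump to stay stable:
oc = dynamic_power(base, 1.15, 1.08)
print(f"{oc:.0f} W -> {oc / base - 1:.0%} more power for 15% more clock")
```

The same relation run in reverse is why a modest underclock plus undervolt sheds power so quickly.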
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card not the overclocked results (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1 but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330 which is 4% lower. Yes you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia but at 35% higher power. That sucks in my book. Others may not care.
They aren't, if you know... as I explained in post #44 to ya. It's a clear picture. :)

1. Can't compare clock speeds between the two.
2. Correct. It was stock for that specific card.
3. The BOOST clocks vary by temperature, correct. Not so much by load, unless it's a light load and the card drops to a set of clocks lower than the base clock.
4. We get it... just sharing with you how to properly read GPU-Z and how Boost works on NVIDIA cards.
5. Again, we get what you are saying, but how you got there wasn't correct. Looks settled now. :)
6. Yes, and that is a factory-overclocked 1080. When AIBs get their hands on Vega, its power use will go up, making that 35% value larger. :(
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,744 (1.71/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Enermax ETX-T50RGB
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard EVGA Z15
Software Windows 11 +startisallback
Either way you are looking at a 400 W card losing to a 200 W card that costs less, btw.

I don't understand AMD. You would think they would give up on GCN; it's obviously shit at scaling to more compute units.

Time and time again they throw more CUs at the problem and all we get is more power consumption and meh performance.
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.27/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
They aren't, if you know... as I explained in post #44 to ya. It's a clear picture. :)

1. Can't compare clock speeds between the two.
2. Correct. It was stock for that specific card.
3. The BOOST clocks vary by temperature, correct. Not so much by load, unless it's a light load and the card drops to a set of clocks lower than the base clock.
4. We get it... just sharing with you how to properly read GPU-Z and how Boost works on NVIDIA cards.
5. Again, we get what you are saying, but how you got there wasn't correct. Looks settled now. :)
6. Yes, and that is a factory-overclocked 1080. When AIBs get their hands on Vega, its power use will go up, making that 35% value larger. :(

We actually don't know number 6 for sure. Remember how Polaris and Fermi cards could pull equal or sometimes even less power than their OEM equivalents due to the temperature drop; Fermi was notorious for pulling less than stock power when heavily overclocked on water. I am not saying it will be a miraculous 50% decrease, but I could see a drop.
 
Joined
Aug 2, 2011
Messages
1,451 (0.31/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Acer X34S, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502 Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
Either way you are looking at a 400 W card losing to a 200 W card that costs less, btw.

I don't understand AMD. You would think they would give up on GCN; it's obviously shit at scaling to more compute units.

Time and time again they throw more CUs at the problem and all we get is more power consumption and meh performance.

Vega and the previous Fury have the same number of shaders (4096). From what we've seen of the Frontier Edition, the increase in performance is directly attributable to the increase in clock speed. (Tests have been done with a Frontier Edition at 1050 MHz, and its performance matched that of the R9 Fury X.)
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
We actually don't know for sure number 6. Remember how the polaris and Fermi cards pull equal or even sometimes less power than their OEM equal due to temperature drop. Fermi was notorious for pulling less than stock power when heavily overclocked on water. Now I am not saying it will be a miracle 50% decrease, but I could see a drop.
Yes and no... clearly a 90 W difference isn't down to one or two pumps on the card. So either that water cooler isn't doing as good a job as other water-cooled implementations, or the water-cooled version is using an arseload more power.
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,744 (1.71/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Enermax ETX-T50RGB
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard EVGA Z15
Software Windows 11 +startisallback
Yes and no... clearly a 90 W difference isn't down to one or two pumps on the card. So either that water cooler isn't doing as good a job as other water-cooled implementations, or the water-cooled version is using an arseload more power.
We know from various benches of the FE what happens when the card goes beyond 1500 MHz: the power consumption skyrockets.
We also know that AMD is using some ultra-aggressive clock gating to keep the card from drawing 400 fking watts.
I leave the rest to your imagination.
 
Top