
Editorial On AMD's Raja Koduri RX Vega Tweetstorm

You mean how many titles can spread CPU load across cores? Less than a handful? Am I reading this post from a reviewer?
Sorry.. a handful. 4c/8t is the sweet spot for gaming now, and will be for a couple of years. ;)

And to clarify, I'm talking about games that can REALLY use the cores, like BF1... There are currently only a handful.
 
Has anyone tried the power-saving BIOS mode, boosted through WattMan till artifacts appear? :laugh:
And btw, do you think your consistently hitting the power limits has anything to do with that 30% power slider increase...?

Could probably reduce the 30% a bit, but that's what was needed to maintain stable 1085 MHz mem clocks in Furmark. 10-15% seemed fine in games.

Got my Vega 64 in the first week they were sold. Waited 2 weeks more for the pre-ordered EK watercooler. Now it's at >1600 MHz all the time, >50% faster than my Fury X before, and hardly reaches 50°C. I'm satisfied with the increase in gaming performance at 1440p so far. So, tnx AMD, you delivered.

I need to block my 1700X too so I am waiting for the fluid gaming line to update to Vega. Their support says it should be out Oct 1st.
 
When I spend $500+ on a GPU I expect it to at least be tuned well enough to use directly. If I can overclock it then great, icing on the cake.

RX Vega on the other hand is already maxed out in terms of thermal profile, power consumption and MHz. How is the end user supposed to adjust it to make it an appealing and somewhat OK gaming card? It is one thing to like a brand and another to hand out free passes when they simply failed to deliver a good enough gaming GPU for the current market.

On top of all that, RTG locked Vega's BIOS, meaning there will be no way to implement any actual BIOS-based modifications. RX Vega is a failure no matter which way you spin the story.

Also adding onto the "compute" market for Vega: well, good luck with RTG's next-to-non-existent technical support. In a market that already has widespread adoption of CUDA, it would be pretty fun to see how much RTG can carve out by utilizing their "open standard, free stuff" strategy.
 
I was hoping for a nice price/performance ratio, much like when the 5 series was introduced. Got the opposite. Maybe next time.
 
I was hoping for a nice price/performance ratio, much like when the 5 series was introduced. Got the opposite. Maybe next time.

No kidding, my old 5870 still serves in my HTPC. In the 4870-5870 era AMD had some good GPUs.
 
111 watts power usage.

Just under stock 1070 performance for less wattage is still nice.

My overclocked 1070 is faster, cost $100 less even than Vega 64 at retail (which is near impossible to get - they are running $300 more than I paid for my 1070), and only uses 150 watts oc'd at full load. Most importantly I used it for more than a full year before the Vega finally released.

If you under-volted a 1070 for a fairer comparison, the power difference would be even less. But saying that in that configuration, which is really apples and oranges, it uses a meager 39 W less is really grasping. At my local electricity rates, even running 100% 24x7 it would take almost 5 years for the 39 W power savings to make up $100, if you could somehow get really lucky and find one for $499. And apples to apples out of the box, the Vega uses far more power.
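If you want to check the math, here's a rough Python sketch (the $0.12/kWh rate is just an assumed example; my "almost 5 years" comes out of a cheaper local rate, around $0.06/kWh):

watt_delta = 39          # claimed power savings (W)
price_delta = 100.0      # price difference to recoup ($)
rate = 0.12              # assumed electricity rate ($/kWh)

# kWh saved per hour, times the rate, gives dollars saved per hour
dollars_per_hour = (watt_delta / 1000) * rate
hours = price_delta / dollars_per_hour
print(f"Break-even at 24/7 load: {hours / (24 * 365):.1f} years")
# ~2.4 years at $0.12/kWh, roughly 5 years at ~$0.06/kWh, and anything
# short of 24/7 load stretches it out much further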

These price/performance marketing pushes AMD keeps attempting just seem meant to distract from the real benchmark numbers, availability, and pricing, which are what they are. And the benchmark numbers aren't bad, but I have to call BS when they claim lower power usage than NV at stock... (huge asterisk) if they way under-volt the Vega (lol). That's like saying a Ferrari gets good gas mileage, if you don't drive it over 20 mph and only drive it down steep hills.
 
It's really sad to see Raja react in this unprofessional manner.
Vega is pretty useless for gaming.

Also adding onto the "compute" market for Vega: well, good luck with RTG's next-to-non-existent technical support. In a market that already has widespread adoption of CUDA, it would be pretty fun to see how much RTG can carve out by utilizing their "open standard, free stuff" strategy.
FYI: The CUDA compiler infrastructure is open source.
 
It's really sad to see Raja react in this unprofessional manner.
Vega is pretty useless for gaming.


FYI: The CUDA compiler infrastructure is open source.

After Nvidia spent a huge amount of money to promote it along with providing great technical support.
 
After Nvidia spent a huge amount of money to promote it along with providing great technical support.
Well, AMD have their "Boltzmann Initiative" to add CUDA support, but I haven't heard much about it lately. It's really up to AMD at this point.
 
He admitted features still aren't enabled, but neglected to mention that basically all of them are still off LOL
 
Sorry.. a handful. 4c/8t is the sweet spot for gaming now, and will be for a couple of years. ;)

And to clarify, I'm talking about games that can REALLY use the cores, like BF1... There are currently only a handful.
Agreed, but I'm talking about CPU efficiency, not performance; that's IMO the main advantage of using a Ryzen in your gaming rig. While there are usually no performance gains from going above 8 threads, there are still huge efficiency benefits to having 12 or 16 threads dividing the work.
The 1800X using less power than the 7700K in games is proof that with more cores available, the load level across cores/threads gets lower and therefore efficiency gets higher. The 7700K at stock draws 30% more power in this chart, and we know the 1700 is even better than the 1800X in performance/watt due to not utilizing XFR, which raises voltage but shows very measly returns.

[Chart: power_gaming.png (CPU power consumption while gaming)]
 
To be fair, if you drop the power slider to -50% the perf/watt is actually pretty good. It's just the performance is more like OC 1060/580 territory.
What? In Power Save mode, the Vega 64 consumes about 40-50 W more power than the 1080, and the performance difference is 5% in 1440p and 2% in 4K. How would that be 1060 territory? It's midway between 1070 and 1080.

Seems overpriced to me... /pass... will be getting a 1080 Ti in a few months.

I can't get your point. You say it is overpriced to you; I thought your next sentence would be "I will be getting a 1080 in a few months". First, the 1080 Ti costs way more than a 64, not to mention the 56. Second, prices may change by that date. So you're buying a card that is more expensive... You could have also said "the RX 560 (or GTX 1050 Ti) is overpriced, I will be buying a 1080 Ti in a few months".

It's really sad to see Raja react in this unprofessional manner.
Vega is pretty useless for gaming.

What, mate? What do you mean by useless? You think gaming only starts at the 1080 Ti level or what? Vega 56 IS FASTER than the 1070 for the same price, and Vega 64 is 1080 performance for the same price with higher power draw. You are totally mistaken.

My overclocked 1070 is faster, cost $100 less even than Vega 64 at retail (which is near impossible to get - they are running $300 more than I paid for my 1070), and only uses 150 watts oc'd at full load.

That's a big bunch of lies and misleading information. First, Vega AIB cards will also be faster, like the AIB 1070/1080 cards, and you can OC them too. Second, the MSI 1070 Gaming X consumes 175 W compared to the FE's 150 W, so forgive me if I don't believe that your OC'd 1070 consumes 150 W. Third and last, saying you bought a 1070 a year ago is useless for people buying a 1070 or Vega 56 NOW, as the 1070's price has risen in recent months even though Nvidia lowered it earlier. And while the 1070's MSRP is $20 lower than the Vega 56's, the fact is that the reference Vega 56 costs about $40 less than the cheapest 1070s. This suggests that the AIB Vega 56s will cost the same as the AIB 1070s.
 
The one thing I don't get is all this talk about undervolting yet overclocking.
Is that just a fluke for some people? Because if that is totally possible, then why does it not ship like that in the first place?
 
What? In Power Save mode, the Vega 64 consumes about 40-50 W more power than the 1080, and the performance difference is 5% in 1440p and 2% in 4K. How would that be 1060 territory? It's midway between 1070 and 1080.

Power Save is -25%. I was talking about -50%, which scores just under a stock 1070.

Power Save uses 165 W and scores ~9550 in GPU score in Fire Strike Extreme.

-50% uses 112 W and scores ~8300.
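To put numbers on the efficiency trade-off, a quick Python sketch using those scores:

results = {
    "Power Save (-25%)": (9550, 165),  # (Fire Strike Extreme GPU score, watts)
    "-50% power limit": (8300, 112),
}
for mode, (score, watts) in results.items():
    print(f"{mode}: {score / watts:.1f} points per watt")
# ~57.9 vs ~74.1 points/W: about 28% better efficiency
# for roughly a 13% lower score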
 
Agreed, but I'm talking about CPU efficiency, not performance; that's IMO the main advantage of using a Ryzen in your gaming rig. While there are usually no performance gains from going above 8 threads, there are still huge efficiency benefits to having 12 or 16 threads dividing the work.
The 1800X using less power than the 7700K in games is proof that with more cores available, the load level across cores/threads gets lower and therefore efficiency gets higher. The 7700K at stock draws 30% more power in this chart, and we know the 1700 is even better than the 1800X in performance/watt due to not utilizing XFR, which raises voltage but shows very measly returns.

[Chart: power_gaming.png (CPU power consumption while gaming)]
Yep! Count those pennies! :)
 
The one thing I don't get is all this talk about undervolting yet overclocking.
Is that just a fluke for some people? Because if that is totally possible, then why does it not ship like that in the first place?
Basically, sometimes there is wiggle room to both undervolt and overclock a bit, though it's usually limited. However, you can typically undervolt/underclock the idle and lower-clock P-states further to maximize performance and efficiency.
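A first-order way to see why that wiggle room exists: dynamic power scales roughly with voltage squared times frequency, so trimming voltage at a given clock frees power-limit headroom that can then go toward higher clocks. A minimal Python sketch (the voltage and clock figures are illustrative assumptions, not measured Vega values):

def rel_power(volts, mhz, v0=1.20, f0=1546):
    """Dynamic power relative to a baseline, using P ~ C * V^2 * f."""
    return (volts / v0) ** 2 * (mhz / f0)

print(f"Stock 1.20 V @ 1546 MHz:     {rel_power(1.20, 1546):.2f}x")
print(f"Undervolt 1.10 V @ 1546 MHz: {rel_power(1.10, 1546):.2f}x")  # ~0.84x
print(f"1.10 V pushed to 1630 MHz:   {rel_power(1.10, 1630):.2f}x")  # ~0.89x
# Voltage enters squared, so a modest undervolt saves more power than a
# modest overclock costs; how far it goes is down to the silicon lottery.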
 
Agreed, but I'm talking about CPU efficiency, not performance; that's IMO the main advantage of using a Ryzen in your gaming rig. While there are usually no performance gains from going above 8 threads, there are still huge efficiency benefits to having 12 or 16 threads dividing the work.
The 1800X using less power than the 7700K in games is proof that with more cores available, the load level across cores/threads gets lower and therefore efficiency gets higher. The 7700K at stock draws 30% more power in this chart, and we know the 1700 is even better than the 1800X in performance/watt due to not utilizing XFR, which raises voltage but shows very measly returns.

[Chart: power_gaming.png (CPU power consumption while gaming)]

The GPU is causing most of the power difference. Remember that because the 7700K gets data to the GPU faster, the GPU can spend more time rendering frames. This in turn gives you a higher frame rate and also higher power consumption from the GPU. I was surprised W1zzard didn't pick up on why the power consumption was so much higher :(

@W1zzard What game did you use for this test?
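One way to sanity-check that point is to compare energy per frame rather than raw watts: if the joules per frame come out similar, the power gap mostly reflects frame rate, not worse efficiency. A quick Python sketch with illustrative numbers (not figures from the review):

systems = {
    "7700K rig": {"watts": 280, "fps": 120},
    "1800X rig": {"watts": 215, "fps": 100},
}
for name, s in systems.items():
    print(f"{name}: {s['watts'] / s['fps']:.2f} J per frame")
# 2.33 vs 2.15 J/frame here: much closer than the raw wattage gap,
# because the faster-fed GPU is simply rendering more frames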
 
Just why would people who just bought Vega do that lol... Better buy a 1060 then...
:wtf::shadedshu:

I do...

I just bought a Vega 64 and it's running in power saver when I am on VMs or when I play old games... even in some new games I am getting decent FPS in 4K... and some I play at 1440 as I don't see much change when playing in 4K... it's an awesome feature to have rather than want and not have...
 
People hated on Fermi, but clearly the lessons learnt then are paying dividends to this very day. The jack-of-all-trades but master-of-none GPU makes zero sense, and from Kepler onwards, well... you know.

Some brutal truths from Raja. Vega competes with GP104 on performance, but it's impossible for them to compete with it on price; with such a large, complex chip that also uses HBM2, it's no wonder there is talk of them losing money on each card sold.

GP102 just adds insult to injury. Meanwhile, Nvidia's HBM2 GP100 was never going to be seen in the consumer market anyway, and its higher costs are easily absorbed by the enormous HPC budgets the card is targeted at.

The reality is AMD achieve wonders with their R&D budgets, but realistically they need to go the same route... if they can afford it.
 
My overclocked 1070 is faster, cost $100 less even than Vega 64 at retail (which is near impossible to get - they are running $300 more than I paid for my 1070), and only uses 150 watts oc'd at full load. Most importantly I used it for more than a full year before the Vega finally released.

I did say just before the part you quoted that it makes a terrible 1060 :) But I think this next quote kinda shows why you MAY want one.

I do...

I just bought a Vega 64 and it's running in power saver when I am on VMs or when I play old games... even in some new games I am getting decent FPS in 4K... and some I play at 1440 as I don't see much change when playing in 4K... it's an awesome feature to have rather than want and not have...

Adding to what he said: since I am running an ultrawide 3440x1440 FreeSync monitor, Vega is the best I can do. 75 Hz with FreeSync, 60 Hz on any Nvidia card. That is also worth something, besides the fact that a G-Sync monitor would have cost me more.

So yeah, perf/$$$ isn't the best, but in the right scenario it's much better to run Vega than Pascal, especially if all your games can run at your refresh rate underclocked so the whole "power-hog" reasoning goes away :)
 
Got my Vega 64 in the first week they were sold. Waited 2 weeks more for the pre-ordered EK watercooler. Now it's at >1600 MHz all the time, >50% faster than my Fury X before, and hardly reaches 50°C. I'm satisfied with the increase in gaming performance at 1440p so far. So, tnx AMD, you delivered.

So how much did that end up costing you?

When I spend $500+ on a GPU I expect it to at least be tuned well enough to use directly. If I can overclock it then great, icing on the cake.

RX Vega on the other hand is already maxed out in terms of thermal profile, power consumption and MHz. How is the end user supposed to adjust it to make it an appealing and somewhat OK gaming card? It is one thing to like a brand and another to hand out free passes when they simply failed to deliver a good enough gaming GPU for the current market.

On top of all that, RTG locked Vega's BIOS, meaning there will be no way to implement any actual BIOS-based modifications. RX Vega is a failure no matter which way you spin the story.

Also adding onto the "compute" market for Vega: well, good luck with RTG's next-to-non-existent technical support. In a market that already has widespread adoption of CUDA, it would be pretty fun to see how much RTG can carve out by utilizing their "open standard, free stuff" strategy.

You should have posted $600+; have you seen these for $500?
I was hoping for a nice price/performance ratio, much like when the 5 series was introduced. Got the opposite. Maybe next time.

Like a kid who didn't get what they wanted for Christmas :D
 
What strikes me most about the stuff Raja says in the second screenshot of Twitter posts is the underlying 'surprise' he shows at how Vega is performing and how it is being received.

- It's not rocket science that 300 W+ is a high number for this perf
- He's surprised at what reviewers found (wtf?! does he know his product?)
- Optimized Vega will be coming in the future. Ah, so you've had all this time, and we got a beta version. Sweet

Got my Vega 64 in the first week they were sold. Waited 2 weeks more for the pre-ordered EK watercooler. Now it's at >1600 MHz all the time, >50% faster than my Fury X before, and hardly reaches 50°C. I'm satisfied with the increase in gaming performance at 1440p so far. So, tnx AMD, you delivered.

Run some benches and post them! Looking forward to seeing it
 
First, AMD is always making the argument that they'll bring what you expected... tomorrow. It's always tomorrow. They've got 'good enough' for now, but tomorrow they'll have better than you can possibly imagine. If you'll just wait. Tomorrow, honest. Always tomorrow.

Second, this article seems to make the argument that AMD is insulated against poor performance because games are built for consoles first and foremost. Except that has been proven not to be the case in several high-profile instances where console-built games that also came to PC showed real problems on AMD cards. Why make an argument already proven false this year? Perpetuating that falsehood doesn't make it true; it's just echo-chambering the same failed logic.
 
Between fairly strong DX12 and 4K performance and very strong 3D modeling performance in Blender, Vega represents good value to me if I could manage to pick one up at MSRP in the semi-near future. However, once Nvidia launches a new lineup, forget it; it's too late in all likelihood.
 