
Editorial On AMD's Raja Koduri RX Vega Tweetstorm

Raevenlord

News Editor
In what is usually described as a tweetstorm, AMD's RTG leader Raja Koduri weighed in on the reception and perception of AMD's RX Vega among both the public and reviewers. There are some interesting tidbits there; namely, AMD's choice to set the RX Vega parts at frequencies and voltages outside the optimal power/performance curve, in a bid to increase their attractiveness to the performance-per-dollar crowd.

However, it can be said that had AMD done otherwise, neither gamers nor reviewers would have been impressed with cards that potentially delivered less performance than their NVIDIA counterparts while still consuming more power (even if significantly less wattage than they do now). At the rated MSRP (and that's a whole new discussion), this RTG decision was the best one for increasing the attractiveness of RX Vega offerings. However, Raja Koduri does stress Vega's dynamic performance/watt ratios, owing to the use of specially defined power profiles.



To our forum-walkers: this piece is marked as an editorial



Raja also touched on an interesting subject; namely, that "Folks doing perf/mm2 comparisons need to account for Vega10 features competing with 3 different competitive SOCs (GP100, GP102 and GP104)". This is interesting, and one of the points we have stressed here on TPU: AMD's Vega architecture makes great strides towards being the architecture AMD needs for its server/AI/computing needs, but it's a far cry from what AMD gamers deserve. It's always a fine line chip designers must tread in deciding which features and workloads to implement or improve upon with new architectures. AMD, due to its lack of a high-performance graphics chip, was in dire need of a solution that addressed both the high-performance gaming market and the server/computing market, where the highest profits are to be made.



This round, the company decided to focus its development more towards the server side of the equation - as Raja Koduri himself puts it regarding Infinity Fabric: "Infinity fabric on Vega is optimized for server. It's a very scalable fabric and you will see consumer optimized versions of it in future." Likely, this is Raja's way of telling us to expect MCMs (Multi-Chip Modules) supported by AMD's Infinity Fabric, much as we see with Ryzen. This design philosophy allows companies to design smaller, more scalable, and cheaper chips that are more resistant to yield issues, effectively improving every metric - from performance to power to yield - compared to a monolithic design. NVIDIA themselves have said that MCM chip design is the future, and this little tidbit from Raja could be a pointer towards AMD Radeon's future in that department.



That Infinity Fabric is optimized for servers is a reflection of the entire Vega architecture: AMD's RX Vega cards may be gaming-oriented, but most of the architectural improvements can't be tapped by existing gaming workloads. AMD tooled its architecture for a professional/computing push, in a bid to fight NVIDIA's ever-increasing entrenchment in that market segment, and it shows. Does this equate to disappointing (current) gaming performance? It does. And it equates to especially disappointing price/performance ratios given the current supply/pricing situation, which the company still hasn't clearly and forcefully addressed. Not that they can, anyway - it's not like AMD can point the finger at the distributors and retailers that carry its products without repercussion, now can they?



Raja Koduri also addressed the current mining/gamer arguments, but in a slightly deflective way: in truth, every sale counts as a sale for the company in terms of profit (and/or loss, since rumors abound that AMD is losing up to $100 on every RX Vega graphics card sold at MSRP). An argument can be made that AMD's retreating share of the PC graphics card market would feed a never-ending cycle of diminishing developer attention towards architectures that aren't prominent in user bases; however, I think AMD considers itself relatively insulated from that particular issue because the company's graphics solutions are embedded in the PS4, PS4 Pro, and Xbox One S consoles, as well as the upcoming Xbox One X. Developers code for AMD hardware from the beginning in most games; AMD can certainly put some faith in that as a factor to tide its GPU performance over until it has the time to properly refine an architecture that shines in both computing and gaming environments. Perhaps with AMD's Navi, bolstered by an increased R&D budget from a (hopefully) successful wager in the professional and computing markets, we will see an AMD that can bring the competition to NVIDIA as well as it has to Intel.

View at TechPowerUp Main Site
 
Seems overpriced to me ... pass ... will be getting a 1080 Ti in a few months.
 
Did you hear @W1zzard, you simply didn't showcase the perf/watt dynamic range of Vega well :roll:
It's not that the Vega is a power hog, you simply used wrong benchmarks ... you know, the ones with wrong dynamic range, meaning the ones that have 100% GPU usage all the time :roll:

I would love not to be sarcastic sometimes, but Raja won't let me :shadedshu:
 

To be fair, if you drop the power slider to -50% the perf/watt is actually pretty good. It's just the performance is more like OC 1060/580 territory.
 

Just why would people who just bought Vega do that lol.....Better buy a 1060 then...
:wtf::shadedshu:
 
AMD should have just set it to optimal when they shipped it. Let the users OC if they want to suck all that power for very little gain. And they should have shaved twenty-five bucks off the price.... Also, it looks like more stock would have been helpful, lol.... And, lets see, what else should AMD have done....? I'm sure u guys will point out the rest.:p
 
Oh give me a rest old man.
 
To be fair, if you drop the power slider to -50% the perf/watt is actually pretty good. It's just the performance is more like OC 1060/580 territory.
You can't say "to be fair" and then be unfair ... setting the power slider is what makes it non-dynamic ... Raja is somehow speaking of gpu power dynamic range reviewers failed to show while running benches that peg the gpu to the max :kookoo:
If he meant different voltage/temp/power curve profiles in multiple bios modes ... I think reviewers showed those, and here we have perf/watt for all bios modes https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/32.html
 
Just why would people who just bought Vega do that lol.....Better buy a 1060 then...
:wtf::shadedshu:

lol true. For the money the Vega 64 makes a terrible 1060 :)

(Attached: 3DMark screenshot)


111 watts power usage.

Just under stock 1070 performance for less wattage is still nice.

You can't say "to be fair" and then be unfair ... setting the power slider is what makes it non-dynamic ... Raja is somehow speaking of gpu power dynamic range reviewers failed to show while running benches that peg the gpu to the max :kookoo:

Core and memory were still dynamic :p All the power save mode does is set the limit to -25%. I just put it to -50%.
 
They did the same thing with Ryzen/Threadripper - hedging their bet by optimizing for workstation/server, then aggressively marketing it towards gamers, then saying it doesn't game very well because they optimized it for compute performance (workstation/server). Also, it beats Intel, as long as you only run the two games or two benchmarks that favor AMD's architecture. Circular logic on an endless loop...
 
Well, you may argue that Ryzen 7 is not up to the 7700K's level of gaming performance, but remember that while the 7700K is sweating every last drop in BF1, Ryzen is doing it at 50% core load.
 
Raja would have loved it if all reviewers had said "Don't worry guys, for only a 5% performance loss you get 40% more efficiency for power savings and summer time, with a simple switch"
 
Well, you may argue that Ryzen 7 is not up to the 7700K's level of gaming performance, but remember that while the 7700K is sweating every last drop in BF1, Ryzen is doing it at 50% core load.
well surely that happens considering it has double the cores and threads...how many titles are like that now?? Less than a handful...but more as time goes on.


....cue (valid) pricing argument.
 
well surely that happens considering it has double the cores and threads...how many titles are like that now?? Less than a handful...but more as time goes on.


....cue (valid) pricing argument.
You mean how many titles can spread CPU load across cores? Less than a handful? Am I reading this post from a reviewer?
 
Core and memory were still dynamic :p All the power save mode does it set the limit to -25%. I just put it to -50%.
I'm simply pointing out how irrelevant all the dynamics (manual or automatic :p) are while a heavy benchmark is being run ... if the GPU is properly cooled, all the sensor graphs flatline at max value ... basically, Raja might as well have asked reviewers to bench power usage with locked frame rates.
 
Nah, even cooled properly the frequency is all over the place as it hits power limits. Boosts up, goes over the limit, overcorrects, and boosts way down. Rinse, repeat.
 
Saw the words twitterstorm and immediately thought:
Bah, just the fake media spouting fake news. We will make AMD great again:rolleyes:. - orange 45
 
well surely that happens considering it has double the cores and threads...how many titles are like that now?? Less than a handful...but more as time goes on.


....cue (valid) pricing argument.


Yeah, BF1 is nearly the best gaming example at about 40-50% CPU load. Titanfall 2, on the other hand, is a single-core slut. I see CPU usage as low as 6% on my 1700X. :laugh:
 
Give me my EPYC_VEGAx4GPU, under 350 watt. :peace:
 
Nah, even cooled properly the frequency is all over the place as it hits power limits. Boosts up, goes over the limit, overcorrects, and boosts way down. Rinse, repeat.
I'm sure it's possible to tweak boost clock ranges in WattMan in the Crimson drivers to smooth out that sawtooth
 
With undervolting, overclocking, and better cooling, the upcoming custom cards should be a good buy: Vega 56 above 1070 performance, and the 64 maybe between the 1080 and 1080 Ti.

 
I'm sure it's possible to tweak boost clock ranges in the Crimson drivers to smooth out that sawtooth
I run a -8% core clock and 15% memory OC. Provides a pretty consistent experience. 30% power slider increase, but doesn't increase actual draw that much.
 
I run a -8% core clock and 15% memory OC. Provides a pretty consistent experience. 30% power slider increase, but doesn't increase actual draw that much.
Has anyone tried power saving bios mode that is then boosted through wattman till artifacts appear? :laugh:
and btw, do you think that consistently hitting the power limits has anything to do with that 30% power slider increase ...
 
Got my Vega 64 in the first week they were sold. Waited 2 more weeks for the pre-ordered EK watercooler. Now it's at >1600 MHz all the time, >50% faster than my Fury X before, and hardly reaches 50°C. I'm satisfied with the increase in gaming performance at 1440p so far. So, thanks AMD, you delivered.
 