
AMD Radeon Pro Vega Frontier Edition Unboxed, Benchmarked

Quite a few more large games have DX12 options. There are thousands of games released every year, so good luck getting through all of them, but there are definitely more games with DX12 (options) than BF1 and Forza. DX12 is mainstream and Vulkan is niche, yet there are roughly as many games of each according to Wikipedia. The Vulkan list may be right, but the DX12 list sure as hell isn't. Just about every new game tested seems to have optional DX12 nowadays, and barely any of them made the list.

Many of the games without an API stated can run in DX12 mode, it seems.
LOL, where to start. DX12 is certainly not mainstream yet, and not every new game seems to have a DX12 mode.

In any case, I haven't yet seen a need for DX12 in any game I have where it's optional. DX11 is working just fine for me... nothing has taxed my system enough yet that DX12 would even be necessary to aid performance.

@I No, AMD hasn't hyped this. The legion of rabid AMD fanboys has. I don't know why you sound so disappointed. Wait until RX Vega releases before being happy or disappointed.
 
After 10 years of similar fan and cooling designs, you expect THIS time to change everything about how radial coolers work? Having reviewed and/or owned probably in excess of 100 graphics cards, very many with radial fan designs, I do find your ignorance of reality a bit insulting.


I too have owned, worked on, installed, and heard many radial blower design cards. Nvidia has managed to keep many of them quiet, and between the fan blade design, shroud, fins, tolerances, and much else, two seemingly identical coolers can differ significantly in how much noise they make and in what pitch or timbre that noise has. Thanks for calling me ignorant.
 
I too have owned, worked on, installed, and heard many radial blower design cards. Nvidia has managed to keep many of them quiet, and between the fan blade design, shroud, fins, tolerances, and much else, two seemingly identical coolers can differ significantly in how much noise they make and in what pitch or timbre that noise has. Thanks for calling me ignorant.
How many of your NVIDIA cards have had fans running at over 4000 RPM?

If you want to check the latest similar fan design from AMD, here is TPU's RX 480 review.
Now add 100 W more heat to be moved off an even beefier fin array.

[chart: fannoise_load.png (fan noise at load, TPU RX 480 review)]
 
Quite a few more large games have DX12 options. There are thousands of games released every year, so good luck getting through all of them, but there are definitely more games with DX12 (options) than BF1 and Forza. DX12 is mainstream and Vulkan is niche, yet there are roughly as many games of each according to Wikipedia. The Vulkan list may be right, but the DX12 list sure as hell isn't. Just about every new game tested seems to have optional DX12 nowadays, and barely any of them made the list.

Many of the games without an API stated can run in DX12 mode, it seems.
yawn.. links...
 
How many of your NVIDIA cards have had fans running at over 4000 RPM?

If you want to check the latest similar fan design from AMD, here is TPU's RX 480 review.
Now add 100 W more heat to be moved off an even beefier fin array.

[chart: fannoise_load.png (fan noise at load, TPU RX 480 review)]
I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart, the 780 I used to have should have been very audible, but I never heard it, even at full load.
 
I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart, the 780 I used to have should have been very audible, but I never heard it, even at full load.
No need to trust W1zzard; YouTube has other RX 480 fan speed noise tests.


5000 RPM, 2600 RPM, etc.

The RX 480 doesn't need to push its fan speed that far, but the Vega FE seems to.
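(For rough intuition on why those RPM numbers matter: a minimal sketch using the generic fan-affinity rule of thumb, not a measurement of these specific coolers. The 2600 and 5000 RPM figures are the ones mentioned above.)

```python
import math

def noise_delta_db(rpm_from: float, rpm_to: float) -> float:
    # Fan-affinity rule of thumb: for the same fan, sound power level
    # changes by roughly 50 * log10(speed ratio) dB.
    return 50 * math.log10(rpm_to / rpm_from)

# Going from 2600 RPM to 5000 RPM on an otherwise identical fan:
print(round(noise_delta_db(2600, 5000), 1))  # ~14.2 dB, a very audible jump
```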
 
How many of your NVIDIA cards have had fans running at over 4000 RPM?

If you want to check the latest similar fan design from AMD, here is TPU's RX 480 review.
Now add 100 W more heat to be moved off an even beefier fin array.

[chart: fannoise_load.png (fan noise at load, TPU RX 480 review)]




The fin array is longer judging from the pictures, but the fan itself is what creates most of the noise, and it looks different from the RX 480 stock fan in small ways. I'm by no means saying it's silent, but the guy who has one said clearly that it's not noisy, and he has a watercooled CPU, so I'm guessing if it were noisy he would know the difference.

I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart, the 780 I used to have should have been very audible, but I never heard it, even at full load.

5870s were quiet for their day, but I hated the way mine sounded before the water block went on.
 
You're not forced to do anything, ever; that's one :)
On top of that, you have zero, absolutely no reason to even think that the gaming-oriented version will cost more than Nvidia's; I don't know where you saw that 'extra', but it goes with what I was saying before. We need to approach this more.. reasonably. It feels like you've added some hyperbole to the equation, then used it to conclude you're right to be disappointed.
Now, in terms of the hype.. why did you allow any hype to influence you in the first place? Are we not in the 21st century? Who here still hasn't learned how marketing is done or, better yet, how much easier it is to let the customers do the marketing for the company? You've only yourself to blame if you fell for that, and that's the truth.

As for the latter part I quote above: NO. That's you thinking like that; skewed perceptions, a market gone haywire, because that's how they prefer you to think. Do all of you here drive Porsches? Live in $1,000,000 mansions? No. If you have a second toddler and need a new house, what are you going to do? Buy the mansion, or go under?
Who said we must want the 'bestest'?
And by the way, because I'm 99% sure I know the reply to this last bit.. if, let's say, it's 5% slower than the 1080 Ti.. is that 'worstest'? A 'fail'? How exactly can you measure anything when all you go by is 'bestest'? Again, rather.. skewed criteria there.

*The same goes for the price/performance ratio.. who told you that's what everyone wants? Some have money to burn and spend it just because, some purchase out of an emotional drive, others out of past impressions/familiarity, etc. etc... this "everyone" is just dead wrong. Keep it limited to what -you- want. And ask yourself why you allowed yourself to form an opinion before it was even out (hype).

Take it or leave it; I'm not defending them. I'm only showing you how this is very, very problematic, and most folks don't even see it anymore.

OK, so basically what you're saying is that I'm not forced to get the product best suited to my needs, because I'm not forced to do so? Wait, what? What does this have to do with a segment of the market that's running completely unopposed at the time of writing? (This would be where those "high hopes" I mentioned would fit in.)
Hype is "this will blow X product out of the water". Realism is "they had a whole year to get this to at least OK levels", and this isn't OK from where I'm sitting. Most users do actually care about price/performance, even more so when we're talking about PC parts. But please permit me to doubt that the same chip in a "gaming" version will do a far better job.
Theorycrafting: if it turns out to be 5% below or above the Ti, then good; that means the market will have competition, which will force a reaction and push the industry further (note Intel's reaction to Ryzen).
People buy whatever is suited to their needs. If they feel the need for a $2000 GPU/CPU/whatever, they will throw money at it.
I don't really care if it's red, blue, green, or purple, or has polka dots on it, as long as it's an incentive for innovation.
My bad on the usage of "everyone"; I should've stuck with "the majority".
 
**put the waders on**

The number of ...less than spectacular displays of knowledge... posts shows how many people need to do some research before posting guesses.

In the meantime, please don't forget this card is not a gaming card and the drivers could be considered beta level. :cool:
 
It wasn't even running at 1600 MHz like it should. If it was running at something like 1400 MHz, that would mean a 1600 MHz score of about 19.5k at least. Besides, we don't know what support is like, we don't know how well RX Vega will do, and most importantly, Fire Strike isn't an actual game. Lots of ifs, dunnos, and a guy doing the benchmarks who doesn't understand WattMan. Great, now we still don't know how good RX Vega is.
You don't know how boost works.
Makers usually specify the base clock and a typical/average boost clock. But this time AMD has chosen to specify the typical clock and the peak boost clock, probably to get a higher peak throughput number to "beat" the competition. However, if we applied the same method to Nvidia, the rated "performance" of the Titan Xp would become 14.2 Tflop/s (assuming an 1850 MHz peak clock).
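(For anyone checking the numbers: a minimal sketch of the peak-FP32 arithmetic, assuming the usual 2 FLOPs per shader per clock via FMA. The shader counts are the public specs; the 1850 MHz Titan Xp peak clock is the assumption stated above.)

```python
def peak_fp32_tflops(shaders: int, clock_mhz: float) -> float:
    # Peak single-precision throughput: one FMA (2 FLOPs) per shader per cycle.
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(peak_fp32_tflops(4096, 1600))  # Vega FE at its 1600 MHz peak clock: ~13.1
print(peak_fp32_tflops(3840, 1850))  # Titan Xp at an assumed 1850 MHz peak: ~14.2
```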
 
You don't know how boost works.
Makers usually specify the base clock and a typical/average boost clock. But this time AMD has chosen to specify the typical clock and the peak boost clock, probably to get a higher peak throughput number to "beat" the competition. However, if we applied the same method to Nvidia, the rated "performance" of the Titan Xp would become 14.2 Tflop/s (assuming an 1850 MHz peak clock).


The guy with the card posted that he had to change the WattMan settings to get it to stay at 1600 MHz, and he only found that out today while playing The Witcher.
 
When are the consumer gaming-oriented versions due?
 
So.... It's a Fury X with higher clocks and "better" memory....


Edit:
My point of view:

1. No driver update in this world would boost it by 30-35% (where the hype train showed it would land).
2. Prosumer card or not, it's marketed with a "GAMING MODE". Last time I checked, the Titan didn't perform worse than the "gaming" version, even though the Titan is clearly aimed at gaming (no pro driver support); it also has some deep-learning usage where it's 20-25% faster than a 1080 Ti.
3. The price is outrageous for what it can do (don't get me wrong, the Titan is a rip-off as well) in both gaming and pro use.
4. AMD did state they won't be supplying the card to reviewers. Yet they were the ones who made a comparison vs. a Titan in the first place (mind you, in pro usage, with the Titan on regular drivers; thanks, nVidia, for that).
5. No sane person would pick this over a Quadro, or hell, even a WX variant, for pro usage.
6. AMD dropped the ball when they showed a benchmark vs. a Titan (it doesn't even matter what the object of the bench was). nVidia, for their part, never show a slide where the competition is mentioned; they only compare the new gen vs. the current gen (which tones down the hype).
7. Apart from the 700 series, every Titan delivered what was promised (and that's because they didn't use the full chip in that Titan).

Long story short: a rushed, unpolished release. Jack-all support. Denying reviewers a sample. AMD is trying to do what nVidia does with the Titan, but it's clearly not in a position to do so. In other words, AMD just pulled another classic "AMD cock-up". Again, this is my POV; I had high hopes for this... but then again, I should've known better...

Seems pretty much spot on. The Vega will also deliver what was promised. The first "prosumer" card, lol. Ever since ATI became AMD, that's all they've seemed able to sell: prosumer cards. The architecture always seems so generic and too future-proof. Too open and too reliant on people devoting their free time to help "the cause"...
 
I'm actually wondering why this card exists.
It's not a gaming card, and at $1000 it can't compete with similar offerings.
It's not a workstation card either, as it gets beaten by much smaller (die) workstation GPUs...

The only reason I can think of for AMD wasting Vega chips on this abomination (and not on RX) is that the early chips were probably utter crap. Hence the delay between FE and RX Vega.

A jack of all trades, but pretty mediocre at all of them.
 
I'm actually wondering why this card exists.
It's not a gaming card, and at $1000 it can't compete with similar offerings.
It's not a workstation card either, as it gets beaten by much smaller (die) workstation GPUs...

The only reason I can think of for AMD wasting Vega chips on this abomination (and not on RX) is that the early chips were probably utter crap. Hence the delay between FE and RX Vega.

A jack of all trades, but pretty mediocre at all of them.

It's the best they could do with a shrunken Fiji....
Now, if only they could hit 2 GHz.... Nvidia seems to have whooped AMD's ass on the way to 2 GHz. Oh well, you can't win ALL of the GHz battles, AMD.
The slogan should have been "Vega: the Bulldozer of GPUs", the RX Vegas will be "RX Vega: the Piledriver of GPUs", and hopefully by the time Navi 2.0 comes out we will have something better than Ryzen to compare it to.
I hate to hate AMD, but I hope the next revision of Ryzen will offer something other than more cores and mediocre IPC.
Done ranting for the night. Later, guys! :pimp:
 
When are the consumer gaming-oriented versions due?

Are you new here or just to the Vega 'situation'?

And my lord, there are so many terrible, stupid comments filled with BS that I wish I had more faces and palms.
 
But wait!!!!!!!!! Drivers aren't mature, and this dude potentially didn't set up the configuration for it......... Ugh, the speculation to play both sides against the middle........... bleh.


Guys, GPUs don't 'brown out' and get "slower" because of an inadequate PSU... ;)

Uhmmm, JayzTwoCents made a video about that today. Even the cables you use from the PSU made a difference in benchmarking. Insignificant, but there was a change.
 
Are you new here or just to the Vega 'situation'?

And my lord, there are so many terrible, stupid comments filled with BS that I wish I had more faces and palms.

I'm new to the V3ga 'situation'!

But I hear ya, it's facepalm galore.
 
I'm new to the V3ga 'situation'!

But I hear ya, it's facepalm galore.

Vega has basically been AMD teasing everyone with these cards for what seems like an eternity. A lot of people don't even think the consumer versions will come out at this point; others said F it and bought 1080s/1080 Tis. I bought a 1080 but am still somewhat thinking about buying Vega, though it's getting to the point where Nvidia will have something that stomps all over it if AMD ever decides to release it.
 
High Bandwidth Cache may be amazing on paper. The truth is Vega's memory bus is 2048-bit versus the Fury X's 4096-bit, and Vega has less total VRAM bandwidth as well. I wonder whether this has anything to do with Vega's not-so-promising performance. After all, no applications or games are optimized for HBC yet.
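(A quick sketch of the bandwidth math behind that point; the bus widths are from the public specs, and ~945 MHz for the FE's HBM2 is the commonly reported figure.)

```python
def peak_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float) -> float:
    # Bus width in bytes times double-data-rate transfers per second, in GB/s.
    return bus_width_bits / 8 * mem_clock_mhz * 2 / 1000

print(peak_bandwidth_gbs(4096, 500))  # Fury X, HBM1 at 500 MHz: 512 GB/s
print(peak_bandwidth_gbs(2048, 945))  # Vega FE, HBM2 at 945 MHz: ~484 GB/s
```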
 
So AMD's super-duper overpriced flagship card is only as fast as NVIDIA's second-tier card and consumes lots of power. Sounds about right. :rolleyes: Kind of the same thing as Ryzen and Intel, though there's not always quite so much difference there, at least.

Maybe in another couple of generations AMD will catch up, but I'm not holding my breath.

Perhaps the upcoming gaming-oriented RX Vega will be better, but I doubt it.
 
AMD releases a $1000 workstation card that's comparable to a $5000 card from the competition. Nice work, AMD!
Looking forward to RX Vega in a little over a month and some real reviews. Reference AMD cards tend to be hot and loud, so tack on another month for custom cooling solutions to appear.
 
So.... It's a Fury X with higher clocks and "better" memory....

No. If you believe this ridiculousness, then Fiji is much faster clock for clock (this is about the same score a Fury X would get at 1.25 GHz).

I guess you've got to rile up fanboys for attention, though.

Might as well ignore all posts for a couple of months.
 
No. If you believe this ridiculousness, then Fiji is much faster clock for clock (this is about the same score a Fury X would get at 1.25 GHz).

I guess you've got to rile up fanboys for attention, though.

Might as well ignore all posts for a couple of months.


I agree. If they kept the 4096-bit width, just switched to HBM2, and used 14 nm to boost the clock speed, then it might be better.
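(Running the same bandwidth math as above on that idea. This is a purely hypothetical configuration; no such Vega part exists, and the ~945 MHz HBM2 clock is just the FE's reported figure.)

```python
# Hypothetical: Fiji's 4096-bit bus paired with HBM2 at the Vega FE's ~945 MHz clock.
print(4096 / 8 * 945 * 2 / 1000)  # ~968 GB/s, roughly double either shipping card
```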

And yeah, I know I know next to squat about GPU electrical engineering, yada yada.
 