
New Performance Benchmarks of AMD's Vega Frontier Edition Surface

Actually, look more closely at what they said. You are putting words in AMD's mouth. They basically said that devs can use this card from development through testing without having to switch out cards along the way.
They do say that the card is "optimized for every stage of this workflow". If the card had inferior performance in final game testing, then that statement is misleading.
 
They do say that the card is "optimized for every stage of this workflow". If the card had inferior performance in final game testing, then that statement is misleading.
Not really. They don't have to have the blistering performance that we as consumers demand in order to optimize a game.

The real problem here is the sheer number of people who never read these professional card descriptions suddenly doing so, and trying to apply their views to something that is foreign to them.

As an example, I've played plenty of mods in different games in which the creator had a low-level GPU. Does that mean I cannot crank up awesome details on what he or she produced? Nope. He provides that capability and only needs to test that it works as intended, not see it at the same level as I can.
 
It's good at things that you'd otherwise need a $3000+ Quadro...
Yeah, but for 3000 dollars I get full support, ECC, drivers that are actually reliable, and known hardware support from 3rd party software. That may not matter to some home gamer, but to a corporation it sure as hell does. Dress it up however you want, the card is a dud, as it essentially has no market outside of fanboys or miners who at this point will take anything that works for them, although maybe not with the watts this beast draws.
 
Not really. They don't have to have the blistering performance that we as consumers demand in order to optimize a game.

The real problem here is the sheer number of people who never read these professional card descriptions suddenly doing so, and trying to apply their views to something that is foreign.
The problem here is not about having blistering performance or not. The problem here is that people insist that FE is inferior in gaming. And if I understood correctly, they don't talk about lower frequencies. It's just inferior, because it's not a gaming card. As simple as that. It is built to be inferior. OK, that would be great news for the RX card. But that doesn't go well with "optimized for every stage of this workflow". It can't be optimized in every stage and at the same time be built to be inferior in at least one stage of that workflow.
 
But that doesn't go well with "optimized for every stage of this workflow". It can't be optimized in every stage and at the same time be built to be inferior in at least one stage of that workflow.
Sure it does. Optimized to be a "do everything at every stage" card does not mean it has to be better than any gaming card. Are you really under the impression that game devs are making games on GPU beasts that would blow all our GPUs out of the water?

No, they just make the program and make sure it works as intended as they go along, with enough detail for them to see their work. The tests of their implementation of the game engine come later, with gaming cards.
 
It's good at things that you'd otherwise need a $3000+ Quadro...

Like what? In the few tests we've seen, Vega FE lost to a Titan Xp ($1200). The cards are in fact very similar, and the gap in performance and efficiency is just what you'd expect when looking at RX 580 vs GTX 1080.
Other than performance, it doesn't have any of the Quadro bonuses - existing pro support, for example.

And BTW: is AMD planning a Vega-based Radeon Pro? What about the FirePro line?
Will there be a competitor to GP100 (i.e. high FP64)?
You missed the fact that AMD said it wasn't a gaming card. Most of your arguments are as delusional as this statement.
Maybe AMD said it wasn't a gaming card, because it doesn't perform well in games? Sadly, it's also not good at anything else. It's just "some card".
And some here are even praising AMD officials for openly saying that this card is awful and they should wait for RX. Just how twisted is that? :eek:

AMD's lineup is becoming a mess. They had gaming cards and pro cards. It all worked. The cards even used to be pretty good a few years back.
Now they released a new segment, which is so confusing that even they got lost (just check their websites).

So when you compare Vega FE to similar cards based on Pascal, Pascal wins.

But you can't compare, because Vega is different. It's so different that it sits alone in its cave of misunderstood and overlooked greatness. They should borrow the competitor's naming convention and call it AMD Copernicus. Except, of course, that Copernicus did something of lasting importance and this is just a card.
If NVIDIA chooses to join the race and make a Pascal-based "pro Titan" (which - looking at the Vega benchmarks - would only need a rebranding), we'll finally be able to officially call Vega s..t.

No, they just make the program and make sure it works as intended as they go along, with enough detail for them to see their work. The tests of their implementation of the game engine come later, with gaming cards.
And until now they were using normal gaming GPUs. Why do we need a "pro" card for game development? What does this card offer over a 1080?
 
And until now they were using normal gaming GPUs. Why do we need a "pro" card for game development? What does this card offer over a 1080?

By spec, more compute performance.
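
To put rough numbers on that, a napkin-math sketch using the public boost-clock specs as assumptions (not measured performance; the helper function is just for illustration):

```python
# Back-of-the-envelope FP32 throughput: shaders x 2 ops/clock (FMA) x clock.
def fp32_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

# Approximate public boost specs at the time (assumptions, not measurements):
print(f"Vega FE : {fp32_tflops(4096, 1.60):.1f} TFLOPS")  # ~13.1
print(f"GTX 1080: {fp32_tflops(2560, 1.73):.1f} TFLOPS")  # ~8.9
```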
 
And until now they were using normal gaming GPUs. Why do we need a "pro" card for game development? What does this card offer over a 1080?

Now THAT is the best question asked in this thread! ;) I have no answer, other than that it appears AMD is trying to create that role. If you get an answer, I would love to know.

Of course, it could be that your premise is flawed, and that a number of game devs are not using gaming cards, but some form of pro card.
 
AMD isn't going to let Vega FE's gaming performance be as good as RX Vega's. Not after telling everyone time and time again that's not what we should be expecting from Vega FE. So how would they make sure that's actually going to be the case, given that Vega FE hardware-wise is almost exactly the same as RX Vega is going to be, except for RX Vega having 8GB less HBM2? There's only one logical conclusion to make as to what they've done. They disabled some feature(s) in the drivers, namely the Draw Stream Binning Rasterizer. Which, I don't care what efikkan has to say on the matter, is driver/software controlled. Rasterization is not entirely a function of hardware. In this case the driver can tell the GPU what type of rasterizer to use or not use: DSBR or not DSBR. And it's been shown, by PCPer with the trianglebin tool, that DSBR, or tile-based rasterization, is not currently working with Vega FE.


So why has AMD done this with the drivers for Vega FE? Because it has little to no impact on non-gaming performance or professional workloads. It just, intentionally, gimps Vega FE for gaming purposes. And they really do NOT want Vega FE to appeal to gamers (not that it would at the $1000+ price point anyway). They need gamers to buy up the RX Vega when it's released. By that time the drivers will suddenly be "fixed". DSBR will work with FE and RX. FE owners will get a nice gaming performance boost as a result. But in comparison to how well RX is going to perform, it will still not justify having bought an FE for gaming. So there, hopefully, won't be any pissed off gamers wishing they'd saved their money for later and bought the much cheaper and better-for-gaming RX instead of the FE. Because they were repeatedly warned ahead of time not to buy the FE for gaming. And shown very convincing benchmarks proving they shouldn't. If they did it anyway, then that's their own dumbass fault and they have nobody to blame but themselves. And that's why AMD is doing this. To keep gamers, and Vega owners in general (FE and RX), happy. Plain and simple. Because Vega FE will never game like the RX Vega will. They already know that. And they're doing their best to keep gamers from making a bad decision by purchasing the Vega FE for gaming. And, all the while, still letting "prosumers" have a generous piece of the Vega gaming pie too. And they will be praised for doing so when all is said and done. And not talked shit about for not doing so. Because they did. It was the smart thing to do. It was the right thing to do. And in the end: happy gamers, happy Vega owners all around (FE and RX), happy AMD. Everyone's happy.

Sounds stupid to you? Well, it's a good thing I'm not asking you then, isn't it? I'm telling you this is the way it is. If you don't believe it, I don't really care. Time, and only time, will tell if what I've said here is true or isn't.
 
Maybe AMD said it wasn't a gaming card, because it doesn't perform well in games? Sadly, it's also not good at anything else. It's just "some card".
:roll::roll::roll::roll::roll::roll::roll::laugh::laugh::laugh::laugh::laugh::laugh:
 
I thought at first that this card was built to fill the Machine Learning/Deep Learning gap in AMD's lineup, not built for gaming. I am so tempted to get this card to build a deep learning server, but now I am so lost and confused after reading many comments here. lol
 
Sure it does. Optimized to be a "do everything at every stage" card does not mean it has to be better than any gaming card. Are you really under the impression that game devs are making games on GPU beasts that would blow all our GPUs out of the water?

No, they just make the program and make sure it works as intended as they go along, with enough detail for them to see their work. The tests of their implementation of the game engine come later, with gaming cards.
Who said anything about better? I am only saying "not much worse", compared to a gaming card with the same number of stream processors, using the same architecture, having the same word, "Vega", in its name. And who said that you need a GPU beast to program? In that first paragraph you are really implying things that were never said, to make my point of view look biased.
As for the second paragraph, that's the idea behind FE. Not having to use a gaming card to see how the game engine REALLY performs. Let me be more specific here: how it really performs on a gaming card with the same specs, using the same architecture, with only frequencies probably being different.
I thought at first that this card was built to fill the Machine Learning/Deep Learning gap in AMD's lineup, not built for gaming. I am so tempted to get this card to build a deep learning server, but now I am so lost and confused after reading many comments here. lol
You need Instinct for this kind of job.
Radeon Instinct™ MI Series
 
The draw-stream binning rasterizer won't always be the rasterization approach that a Vega GPU will use. Instead, it's meant to complement the existing approaches possible on today's Radeons. AMD says that the DSBR is "highly dynamic and state-based," and that the feature is just another path through the hardware that can be used to improve rendering performance.
It's obvious that the driver has to have dynamic DSBR control enabled. What I wouldn't like to see is games having DSBR disabled altogether in the driver game profile because it crashes the game, or frequent switching between modes.
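
For anyone wondering what the "binning" part actually refers to, here is a minimal CPU-side sketch of the general idea: sort work into screen-space tiles so each tile can be rasterized together for better cache locality. This is just the concept, not AMD's DSBR hardware or driver logic; the tile size, bounding-box test, and names here are arbitrary assumptions.

```python
# Conceptual sketch of "binning": map each triangle to the screen tiles
# its bounding box overlaps, so a tile's work can be processed together.
from collections import defaultdict

TILE = 32  # hypothetical tile size in pixels

def bin_triangles(triangles, width, height):
    """Return {(tile_x, tile_y): [triangle ids]} using bounding-box overlap."""
    bins = defaultdict(list)
    for tri_id, verts in enumerate(triangles):
        xs = [x for x, _ in verts]
        ys = [y for _, y in verts]
        # Clamp the triangle's bounding box to the screen.
        x0, x1 = max(0, min(xs)), min(width - 1, max(xs))
        y0, y1 = max(0, min(ys)), min(height - 1, max(ys))
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                bins[(tx, ty)].append(tri_id)
    return bins

# Two triangles on a 128x128 "screen": one small, one spanning many tiles.
tris = [
    [(5, 5), (20, 5), (5, 20)],
    [(0, 0), (120, 10), (10, 120)],
]
for tile, ids in sorted(bin_triangles(tris, 128, 128).items()):
    print(f"tile {tile}: triangles {ids}")
```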
 
Some people try to improve their status on forums by spewing negative comments as soon as any tiny problem arises with a new or upcoming product, acting as though they have "inside information" and making stuff up. Some people are afraid of being seen as a newbie or fanboy by saying they like something; it's easier to put down everything so they seem edgy and cool (at least in their own mind). Nothing new; I've noticed this tendency in bullies and the small-minded since I was in first grade (50 years ago), and now these same losers all have computers and can spew their hate to a much larger group of people. This negativity is a reliable indicator that they can be safely ignored. Or, you can read their comments and take away a reinforced positive impression of the product (if this nimrod feels threatened by it, it must be a good product).

LOL. Sad, but true. People do tend to spew what they believe their peers want to hear rather than adding new lines of thought to a debate.

I still maintain my stance of wait until the consumer cards have arrived and get benchmarked.
 
Well, this sounds.....PROMISING!!!!

I vividly recall that the "realistic" guess was that it (the actual gaming card with drivers some months old) would perform between 1070 and 1080 speed.

I see now that the FE, with what are essentially alpha drivers, is achieving those values already.

So add the extra speed of the Gaming edition, and add the fact that this isn't even the actual gaming driver yet. Let this soak in some months of driver optimization (against the 1080, whose drivers are now solid and won't improve that much anymore):

- Gaming edition
- Drivers
- 1080 drivers already optimal
- AMD drivers age much better
- ...pricing(cost/performance)?

My guess is that the intel+nvidia combination will be beaten by the AMD+AMD combination :)

Clear win for AMD here, folks.
 
Well, this sounds.....PROMISING!!!!

I vividly recall that the "realistic" guess was that it (the actual gaming card with drivers some months old) would perform between 1070 and 1080 speed.

I see now that the FE, with what are essentially alpha drivers, is achieving those values already.

So add the extra speed of the Gaming edition, and add the fact that this isn't even the actual gaming driver yet. Let this soak in some months of driver optimization (against the 1080, whose drivers are now solid and won't improve that much anymore):

- Gaming edition
- Drivers
- 1080 drivers already optimal
- AMD drivers age much better
- ...pricing(cost/performance)?

My guess is that the intel+nvidia combination will be beaten by the AMD+AMD combination :)

Clear win for AMD here, folks.

Well, Vega FE is being beaten by Xtreme Addict's 1400 MHz fully unlocked Fury Strix in Fire Strike Extreme (Vega FE runs @ 1600 MHz).

http://forum.hwbot.org/showthread.php?t=142320

The driver issues are clear.
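
A rough illustration of why that points at drivers, using the same napkin-math formula as earlier in the thread (the clocks and shader counts are assumptions taken from the post and public specs, not measurements): if a fully unlocked Fiji at 1.4 GHz wins despite Vega FE having more raw throughput on paper, the silicon isn't the obvious bottleneck.

```python
# Theoretical FP32 throughput: shaders x 2 ops/clock (FMA) x clock.
def fp32_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

fury = fp32_tflops(4096, 1.40)  # Fury Strix at 1400 MHz
vega = fp32_tflops(4096, 1.60)  # Vega FE at 1600 MHz
print(f"Fury @1.4 GHz   : {fury:.1f} TFLOPS")             # ~11.5
print(f"Vega FE @1.6 GHz: {vega:.1f} TFLOPS")             # ~13.1
print(f"Vega FE paper advantage: {vega / fury - 1:.0%}")  # ~14%
```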
 
…Raja on the other hand says "Don't bother with FE. RX is just around the corner. It will be cheaper, probably clocked higher, and the gaming card you are looking for. No reason to go and spend $300-$500 more".
Raja saying that the FE card is not for gaming doesn't mean that FE is bad in gaming. Just that there is no reason to go and pay for extra pro features in drivers that are useless in games. He is just more honest.

Also, no one says that those two cards will be 100% identical. RX will probably have higher frequencies and maybe a better cooling solution. I wouldn't be surprised if the RX is at the same level of performance, or a little better if clocked higher, compared to the liquid version of FE. Now, if AMD also manages to enable some features that are now disabled in the coming days, features that increase performance in games, those features will become available to the FE card too, improving its performance in games. You'll see that, if it happens.
You pretty much nailed it there. There is no reason for gamers to pay extra for features they don't need, which is why it's not marketed for gaming only; it's not the best value option for gaming, not that it's any worse at it. The same goes for Titan and Quadro; they are excellent at gaming, but are not the best options if you're doing only gaming.
 