
Vega Frontier Ed Beats TITAN Xp in Compute, Formidable Game Performance: Preview

But when AMD makes optimized games, they just optimize their own stuff. When NVIDIA does it, they optimize their own and actively nerf the competition.
 

No, quite wrong, Gameworks pisses off everyone.

Though to be fair that's due to lazy implementation. And you can turn GW features off in most sensible titles.
 
Don't forget, it's up to them to prove it to us, not for us to have faith in them or make excuses for them.

Think about it, if they make fantastic sales by not being quite as good as their competition, then what's to motivate them to beat their competition?

Because they are not beating the competition right now?
Gotta have revenue to develop better tech, and Nvidia and Intel have a LOT more revenue, and Intel is doing nothing with it due to no competition.
So invest in AMD, so they have the revenue to develop more and compete with what Intel is finally bringing out as a reaction to Ryzen, and with Nvidia.
 
Am I the only one who can see that this card totally whoops the NVIDIA Quadro cards? Look at these figures.
This is the Quadro score, rendered at 1900x1060:
https://uploads.disquscdn.com/image...09e43f7b822f3d22f5dd8289aca764333820b236c.png


This is the Vega score, rendered at 4K but in a window of 1900x1060. Vega totally wrecks the NVIDIA GPUs.
[attached screenshot: Vega result]
 
Here are the Quadro results.

Even the Maxwell-based card beats this in compute. So at the hardware level, the pro drivers make Maxwell better than Vega. Obviously the 6000 range is the GM200 core, but either way, this M6000 has only 3072 cores (about 1000 fewer than Vega).

[attached screenshot: Quadro results]


Not sure what RTG are getting at. The Titan X/p/Xp-alidocious has always been a stupidly expensive gaming card with limited appeal for compute. We all know that. If you want to show your compute prowess, use the M125 card, not Vega FE. I assume RTG feel the FE is the same as the Titan (i.e. a rip-off for the consumer); if so, welcome to the club, RTG/AMD, you're about to piss off your fans.

You folks do understand now, don't you? This is AMD's Titan card, and they'll charge you heavily for it, just like Nvidia does. So, do we expect all those who vehemently denigrated Nvidia for their pricing to come out and have a shot at AMD???

The black shoe called pot is on the other kettle foot now. Or something.


Considering they said it's made for compute, it's little surprise that it falls behind in gaming; Raja or whoever at RTG even said that. It's meant for oil exploration and real-time modeling, meaning there are checks in place and forced high accuracy. Speaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, while AMD sees no performance penalty? My 7970 plays with HDR enabled with no loss that I have noticed.
 
"well should the TDP be an indication of actual power consumption, then this..."victory" is really not all that impressive" then sure.
But you just say "hey it uses a lot of power, its not looking good" which is just in no way linked to this comparison at all.

TDP is an indicator of consumption. Remember Fermi? Most enthusiast gamers don't want a jet engine in their case, and that's why my first post is still valid.
 
Considering they said it's made for compute, it's little surprise that it falls behind in gaming; Raja or whoever at RTG even said that. It's meant for oil exploration and real-time modeling, meaning there are checks in place and forced high accuracy. Speaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, while AMD sees no performance penalty? My 7970 plays with HDR enabled with no loss that I have noticed.
My Fury X performs fine.
I've run it in 12-bit color for nearly two years now and have not noticed any difference in fps.

Vega is going to be fast, but there's so much misinformation out there.
 
Yikes... no comment on the actual article, but... this misinformation is killing me, lol!
Ermmm, that has nothing to do with anything in this article... and again, TDP is not the same as actual power consumption.
If I put ten 8-pin connectors on an RX 480, the TDP will be through the roof; consumption, however, stays the same.

But yeah, stop posting unrelated stuff.
And TDP has nothing to do with how many power connectors are on a card. TDP is more closely related to board power than to the number of power leads on the card.
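To put rough numbers on that (my own back-of-the-envelope sketch, using the usual PCIe spec figures and a made-up card, nothing from the article): the connectors only set a delivery ceiling, while the actual draw is whatever the board pulls.

```cpp
// Back-of-the-envelope sketch (assumed PCIe spec limits, made-up draw figure):
// the connectors define a power-delivery ceiling, not the actual consumption.
#include <cstdio>

int main() {
    const int slot_w    = 75;   // PCIe x16 slot, per spec
    const int six_pin   = 75;   // 6-pin PEG connector, per spec
    const int eight_pin = 150;  // 8-pin PEG connector, per spec

    // Hypothetical card with one 8-pin and one 6-pin connector.
    const int ceiling       = slot_w + eight_pin + six_pin;  // 300 W deliverable
    const int measured_draw = 220;                            // made-up gaming draw, watts

    std::printf("delivery ceiling: %d W, actual draw: %d W\n", ceiling, measured_draw);
    // Adding more connectors would raise the ceiling, not the draw.
    return 0;
}
```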
 
Considering they said it's made for compute, it's little surprise that it falls behind in gaming; Raja or whoever at RTG even said that. It's meant for oil exploration and real-time modeling, meaning there are checks in place and forced high accuracy. Speaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, while AMD sees no performance penalty? My 7970 plays with HDR enabled with no loss that I have noticed.

Having bought a £2300 HDR 4K LG OLED last year and having very few things to actually watch in HDR (even now), I really don't care too much. I also only game on an LED Dell monitor I bought in 2011 (U2711b) for £550 (top end back then). I can genuinely say that HDR is good, but on a good TV (mine, for example), 1080p HD from a good source (Blu-ray) is not that different from 4K HDR (from a good source, UHD Blu-ray).

So I couldn't tell you how Pascal fares with HDR, and I wouldn't be willing to shell out the cash for a better monitor. It's always about the hardware... And the LG is too slow for gaming.

FWIW, HDR is really, really a PR product so far, with very little support. Give it a few years...

EDIT: streaming HDR at 4K is also not that great - poor compression, even on Netflix and Amazon. An HD Blu-ray still looks better than streamed UHD (where I live, anyway). I also own a UHD Blu-ray player, and the UHD discs are far superior to 4K streamed, but not so much better than HD discs.
 
Funny thing is, AMD compares its Vega FE to Quadro price-wise, even though an $800 Quadro P4000 wipes the floor with Vega FE. Another funny thing: AMD displayed fps on Vega demos a year ago with Doom and Battlefront, and then Sniper Elite 4 fps on Vega FE, but suddenly when the Titan Xp is in the room the fps counters are gone! Yeah, total product confidence alright!
It's a more "pro" Titan Xp competitor, so more work and less play, but still by no means a Quadro competitor (although it will probably fare well in most Quadro applications). For prosumers it's a bloody good card, though!
 
You define 'bloody good' as performance we don't know and power rumored to be 300 and 375 W?

Damn I wish I was selling products to people these days... :roll:
 
You define 'bloody good' as performance we don't know and power rumored to be 300 and 375 W?

Damn I wish I was selling products to people these days... :roll:
TDPs of 300 W and 375 W, probably more like 250 W and 300 W actual power usage. We have at least had a few benchmarks, enough to see Vega FE can hold its own for its use case at its price point. Unless you mostly use Nvidia-optimized software or want more "pro" than a "prosumer" card can give you, Vega FE is pretty good.
 
Speaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, while AMD sees no performance penalty? My 7970 plays with HDR enabled with no loss that I have noticed.
FYI: All current games do fragment processing in fp32, which in the end is downsampled into the final framebuffer. Games are already doing HDR internally, but have been using tone mapping until now. Using HDR output should not have a negative impact on performance.
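To illustrate the point (a minimal sketch of my own, not anything from the article; real HDR10 output uses a PQ transfer curve rather than the linear quantization shown here): the shader output is fp32 either way, and only the final encode step differs between the SDR and HDR paths.

```cpp
// Minimal sketch: the shaded value is fp32 in both paths; SDR tone-maps and
// quantizes to 8 bits, while HDR keeps more range and quantizes to 10 bits.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

float tone_map(float v) { return v / (1.0f + v); }  // simple Reinhard curve

uint8_t encode_sdr(float v) {
    return static_cast<uint8_t>(std::lround(std::clamp(tone_map(v), 0.0f, 1.0f) * 255.0f));
}

uint16_t encode_hdr10(float v) {
    return static_cast<uint16_t>(std::lround(std::clamp(v, 0.0f, 1.0f) * 1023.0f));
}

int main() {
    const float shaded = 0.8f;  // hypothetical fp32 value coming out of the shader
    std::printf("SDR  8-bit code: %u\n", static_cast<unsigned>(encode_sdr(shaded)));
    std::printf("HDR 10-bit code: %u\n", static_cast<unsigned>(encode_hdr10(shaded)));
    return 0;
}
```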
 
Having bought a £2300 HDR 4K LG OLED last year and having very few things to actually watch in HDR (even now), I really don't care too much. I also only game on an LED Dell monitor I bought in 2011 (U2711b) for £550 (top end back then). I can genuinely say that HDR is good, but on a good TV (mine, for example), 1080p HD from a good source (Blu-ray) is not that different from 4K HDR (from a good source, UHD Blu-ray).

So I couldn't tell you how Pascal fares with HDR, and I wouldn't be willing to shell out the cash for a better monitor. It's always about the hardware... And the LG is too slow for gaming.

FWIW, HDR is really, really a PR product so far, with very little support. Give it a few years...

EDIT: streaming HDR at 4K is also not that great - poor compression, even on Netflix and Amazon. An HD Blu-ray still looks better than streamed UHD (where I live, anyway). I also own a UHD Blu-ray player, and the UHD discs are far superior to 4K streamed, but not so much better than HD discs.

Too slow? 22 ms is fine for gaming for me; human reaction time is around 300-500 ms anyway.

HDR = higher bit depth per color channel, which is supposed to eliminate color banding, so anything below Blu-ray quality that is advertised as HDR is not worth it.
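Rough numbers behind the banding point (my own sketch, not from the thread): more bits per channel means more code values across the same range, so each step is smaller and banding is less visible.

```cpp
// Sketch: code values per channel and step size for common bit depths.
#include <cstdio>

int main() {
    const int depths[] = {8, 10, 12};
    for (int bits : depths) {
        const int levels     = 1 << bits;             // 2^bits codes per channel
        const double step_pc = 100.0 / (levels - 1);  // one step as % of full range
        std::printf("%2d-bit: %5d levels, %.3f%% per step\n", bits, levels, step_pc);
    }
    return 0;
}
```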
 
TDPs of 300 W and 375 W, probably more like 250 W and 300 W actual power usage. We have at least had a few benchmarks, enough to see Vega FE can hold its own for its use case at its price point. Unless you mostly use Nvidia-optimized software or want more "pro" than a "prosumer" card can give you, Vega FE is pretty good.
I've never seen board power be so much less than the TDP... we have seen business uses, not games, or at least not games that are highly optimized for AMD in the first place. It's more pro than consumer.
 
TDPs of 300 W and 375 W, probably more like 250 W and 300 W actual power usage. We have at least had a few benchmarks, enough to see Vega FE can hold its own for its use case at its price point. Unless you mostly use Nvidia-optimized software or want more "pro" than a "prosumer" card can give you, Vega FE is pretty good.

When in the history of AMD have they released a GPU that offered substantially lower power usage vs TDP?
 
Meh, too late imo.

Even though I love Radeon graphics, I just snagged a second reference 1080 Ti for like 550€, so I guess I'm fine for a couple of years.
 
Were there compute benches somewhere? I only see SPECviewperf (D3D11, OpenGL) and Cinebench R15 (OpenGL); neither benchmarks compute performance.
 
TDP is an indicator of consumption. Remember Fermi? Most enthusiast gamers don't want a jet engine in their case, and that's why my first post is still valid.

I'm not saying it's not an indicator.
I'm saying the framing of the comment is just off.
Unless you add a reason for suddenly talking about the power consumption of either card, it's a random new variable that is not mentioned in the article.
The article is purely comparing performance in a set of tests between the two cards.

I mean, you might as well start talking about the colors of the card, or how convenient it is to fit one in your case, or store availability.
Unless you also provide context like I did in my example, it has nothing to do with the article and is thus unrelated.
 
Because they are not beating the competition right now?
Gotta have revenue to develop better tech, and Nvidia and Intel have a LOT more revenue, and Intel is doing nothing with it due to no competition.
So invest in AMD, so they have the revenue to develop more and compete with what Intel is finally bringing out as a reaction to Ryzen, and with Nvidia.
Well, I'll let you do that investing then!

The situation's a bit of a catch-22 really, because they need the money for R&D, and we need to look after our interests by buying the best products at the time, which may not be AMD's. It's up to them to find the money some other way, perhaps with loans, investments, etc. - the kind of thing that highly paid corporate accountants are good at. ;)

No, they're not beating Intel on IPC, unfortunately, which means that games run faster on Intel, and that's an important selling point that will directly affect their sales. There are loads of reviews out there that show this, including a couple on TPU.
 
No, quite wrong, Gameworks pisses off everyone.

Though to be fair that's due to lazy implementation. And you can turn GW features off in most sensible titles.

Not picking on you, but there is really no reason to be lazy. Especially when Nvidia is sponsoring and has gone on the record in several tech interviews (print and video) via Tom Petersen (PCPerspective), saying they send a team of GameWorks engineers to the game companies to help with GameWorks in titles.

Not really lazy implementation at that point.
 
Dude... the article is Vega FE vs Titan, and my post was Vega FE vs Titan. And they most likely do not set the TDP that high for no reason.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/28.html

Here it says that even the Founders Edition of the 1080 Ti consumes ~270 W max.

https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/29.html

Here again, it shows that a factory-overclocked 1080 Ti can consume up to 320 W.

In conclusion, don't look at the official TDP numbers to make assumptions about the true consumption of a CPU or GPU. My estimate is that the full RX Vega will consume close to 280 W. The top version with water cooling might surpass 300 W. Normal for top-tier gaming cards, imho. And the TDP limit allows some OC headroom too, so it MUST be higher than the stock-clock consumption.
 
Doom with Vulkan.
Prey (which is DX11 anyway).
Sniper Elite 4 (DX12).

Cherry-picked, that's all.

However, not too long now, only another month and a bit. And a bit more. As mentioned too, AMD have been humping Nvidia in compute for ages now on their gaming-based cards. But to be realistic, I don't use compute to game.

Actually, you do. Many newer game titles use compute, and the amount of compute being used in games is only increasing.
 
Hopefully someone will bench this poncy FE card soon, my God... this is the most epically drawn-out launch EVAR.
 