
New Performance Benchmarks of AMD's Vega Frontier Edition Surface

Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Actually, look more closely at what they said. You are putting words in AMD's mouth. They basically said that devs can use this card from development through testing without having to switch out cards when testing.
They do say that the card is "optimized for every stage of this workflow". If the card had inferior performance in final game testing, then that statement would be misleading.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
They do say that the card is "optimized for every stage of this workflow". If the card had inferior performance in final game testing, then that statement would be misleading.
Not really. They don't have to have the blistering performance that we as consumers demand in order to optimize a game.

The real problem here is the sheer number of people who never read these professional card descriptions suddenly doing so, and trying to apply their views to something that is foreign to them.

As an example, I've played plenty of mods in different games in which the creator had a low-end GPU. Does that mean I cannot crank up awesome details on what he or she produced? Nope. He provides that capability and only needs to test that it works as intended, not see it at the same level as I can.
 
Joined
Jan 13, 2011
Messages
219 (0.05/day)
It's good at things that you'd otherwise need a $3000+ Quadro...
Yeah, but for 3000 dollars I get full support, ECC, drivers that are actually reliable, and known hardware support from third-party software. That may not matter to some home gamer, but to a corporation it sure as hell does. Dress it up however you want; the card is a dud, as it essentially has no market outside of fanboys or miners, who at this point will take anything that works for them, although maybe not with the watts this beast draws.
 
Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Not really. They don't have to have the blistering performance that we as consumers demand in order to optimize a game.

The real problem here is the sheer number of people who never read these professional card descriptions suddenly doing so, and trying to apply their views to something that is foreign.
The problem here is not about having blistering performance or not. The problem here is that people insist that the FE is inferior in gaming. And if I understood correctly, they don't mean lower frequencies. It's just inferior, because it's not a gaming card. As simple as that. It is built to be inferior. OK, that would be great news for the RX card. But that doesn't go well with "optimized for every stage of this workflow". It can't be optimized for every stage and at the same time built to be inferior in at least one stage of that workflow.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
But that doesn't go well with "optimized for every stage of this workflow". It can't be optimized for every stage and at the same time built to be inferior in at least one stage of that workflow.
Sure it does. Being optimized to "do everything at every stage" does not mean it has to be better than any gaming card. Are you really under the impression that game devs are making games on GPU beasts that would blow all our GPUs out of the water?

No, they just make the program and make sure it works as intended as they go along, with enough detail for them to see their work. The tests of their implementation of the game engine come later, with gaming cards.
 
Joined
Jun 28, 2016
Messages
3,595 (1.26/day)
It's good at things that you'd otherwise need a $3000+ Quadro...

Like what? In the few tests we've seen, Vega FE lost to a Titan Xp ($1200). The cards are in fact very similar, and the gap in performance and efficiency is just what you'd expect from looking at the RX 580 vs the GTX 1080.
Other than performance, it doesn't have any of the Quadro bonuses - existing pro support, for example.

And BTW: is AMD planning a Vega-based Radeon Pro? What about the FirePro line?
Will there be a competitor to GP100 (i.e. high FP64)?
You missed the fact that AMD said it wasn't a gaming card. Most of your arguments are as delusional as this statement.
Maybe AMD said it wasn't a gaming card, because it doesn't perform well in games? Sadly, it's also not good at anything else. It's just "some card".
And some here are even praising AMD officials for openly saying that this card is awful and they should wait for RX. Just how twisted is that? :eek:

AMD's lineup is becoming a mess. They had gaming cards and pro cards. It all worked. The cards even used to be pretty good a few years back.
Now they've released a new segment which is so confusing that even they got lost (just check their websites).

So when you compare Vega FE to similar cards based on Pascal, Pascal wins.

But you can't compare, because Vega is different. It's so different that it sits alone in its cave of misunderstood and overlooked greatness. They should borrow the competitor's naming convention and call it AMD Copernicus. Of course, other than the fact that Copernicus did something forever important and this is just a card.
If NVIDIA chooses to join the race and make a Pascal-based "pro Titan" (which - looking at the Vega benchmarks - would only need a rebranding), we'll finally be able to officially call Vega s..t.

No, they just make the program and make sure it works as intended as they go along, with enough detail for them to see their work. The tests of their implementation of the game engine come later, with gaming cards.
And until now they were using normal gaming GPUs. Why do we need a "pro" card for game development? What does this card offer over a 1080?
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.27/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
And until now they were using normal gaming GPUs. Why do we need a "pro" card for game development? What does this card offer over a 1080?

By spec, more compute performance.
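
Rough numbers from the published specs, as a back-of-the-envelope sketch (the boost clocks below are approximate, so treat the results as ballpark figures): FP32 throughput is shader count x 2 FLOPs per clock (FMA) x clock.

[CODE]
# Back-of-the-envelope FP32 throughput from published specs:
# shaders x 2 FLOPs/clock (fused multiply-add) x boost clock.
# Boost clocks are approximate, so these are ballpark figures only.

def fp32_tflops(shaders: int, boost_mhz: float) -> float:
    return shaders * 2 * boost_mhz * 1e6 / 1e12

cards = {
    "Vega Frontier Edition": (4096, 1600),  # ~1600 MHz typical boost
    "GeForce GTX 1080":      (2560, 1733),  # ~1733 MHz reference boost
}

for name, (shaders, mhz) in cards.items():
    print(f"{name}: ~{fp32_tflops(shaders, mhz):.1f} TFLOPS FP32")

# Roughly 13.1 TFLOPS for Vega FE vs 8.9 TFLOPS for the GTX 1080 on paper.
[/CODE]

Whether that paper advantage shows up in real workloads is another question, of course.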
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
And until now they were using normal gaming GPUs. Why do we need a "pro" card for game development? What does this card offer over a 1080?

Now THAT is the best question asked in this thread!;) I have no answer, other than it appears AMD is trying to create that role. If you get an answer, I would love to know.

Of course, it could be that your premise is flawed, and that a number of game devs are not using gaming cards, but some form of pro card.
 
Joined
Feb 2, 2015
Messages
2,707 (0.80/day)
Location
On The Highway To Hell \m/
AMD isn't going to let Vega FE's gaming performance be as good as RX Vega's. Not after telling everyone time and time again that's not what we should be expecting from Vega FE. So how would they make sure that's actually going to be the case, given that Vega FE is, hardware-wise, almost exactly the same as RX Vega is going to be, except for RX Vega having 8GB less HBM2? There's only one logical conclusion to make as to what they've done: they disabled some feature(s) in the drivers. Namely the Draw Stream Binning Rasterizer. Which, I don't care what efikkan has to say on the matter, is driver/software controlled. Rasterization is not entirely a function of hardware. In this case the driver can tell the GPU what type of rasterizer to use or not use: DSBR or no DSBR. And it's been shown, by PCPer with the trianglebin tool, that DSBR, i.e. tile-based rasterization, is not currently working on Vega FE.
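
For anyone wondering how a trianglebin-style test shows this: the idea is to draw one huge triangle with a fragment shader that pulls a sequence number from a global atomic counter and encodes it as a colour, so the saved image reveals the order in which the rasterizer filled the triangle. Compact tile-shaped blocks point to a binning/tile-based rasterizer; long horizontal sweeps point to the classic immediate-mode path. Below is a rough Python sketch of that idea (not PCPer's actual tool), assuming the moderngl and Pillow packages and an OpenGL 4.3 capable driver.

[CODE]
# Rough sketch of a trianglebin-style rasterization-order test (illustration only).
# Each fragment takes a sequence number from a shared counter (SSBO + atomicAdd)
# and turns it into a colour, so the saved image shows the order in which the
# rasterizer filled one huge triangle: tile-shaped blocks suggest a binning /
# tile-based rasterizer, long horizontal sweeps suggest the immediate-mode path.
import moderngl
import numpy as np
from PIL import Image

SIZE = 1024

ctx = moderngl.create_standalone_context()

prog = ctx.program(
    vertex_shader="""
        #version 430
        in vec2 in_pos;
        void main() { gl_Position = vec4(in_pos, 0.0, 1.0); }
    """,
    fragment_shader="""
        #version 430
        layout(std430, binding = 0) buffer Seq { uint next; };
        out vec4 colour;
        void main() {
            uint n = atomicAdd(next, 1u);            // global fill order
            float t = float(n) / (1024.0 * 1024.0);  // normalise by pixel count
            colour = vec4(t, fract(t * 32.0), 1.0 - t, 1.0);
        }
    """,
)

# One oversized triangle that covers the whole framebuffer.
vbo = ctx.buffer(np.array([-1.0, -1.0, 3.0, -1.0, -1.0, 3.0], dtype="f4").tobytes())
vao = ctx.vertex_array(prog, [(vbo, "2f", "in_pos")])

# Shared fragment counter, zero-initialised, bound to SSBO binding point 0.
counter = ctx.buffer(np.zeros(1, dtype="u4").tobytes())
counter.bind_to_storage_buffer(0)

fbo = ctx.simple_framebuffer((SIZE, SIZE))
fbo.use()
fbo.clear(0.0, 0.0, 0.0, 1.0)
vao.render(moderngl.TRIANGLES)

Image.frombytes("RGB", (SIZE, SIZE), fbo.read()).save("raster_order.png")
[/CODE]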


So why has AMD done this with the drivers for Vega FE? Because it has little to no impact on non-gaming performance or professional workloads. It just, intentionally, gimps Vega FE for gaming purposes. And they really do NOT want Vega FE to appeal to gamers (not that it would at the $1000+ price point anyway). They need gamers to buy up RX Vega when it's released. By that time the drivers will suddenly be "fixed". DSBR will work on FE and RX. FE owners will get a nice gaming performance boost as a result. But compared to how well RX is going to perform, it will still not justify having bought an FE for gaming. So there, hopefully, won't be any pissed-off gamers wishing they'd saved their money and bought the much cheaper and better-for-gaming RX instead of the FE. Because they were repeatedly warned ahead of time not to buy the FE for gaming, and shown very convincing benchmarks proving they shouldn't. If they did it anyway, then that's their own dumbass fault and they have nobody to blame but themselves. And that's why AMD is doing this: to keep gamers, and Vega owners in general (FE and RX), happy. Plain and simple. Because Vega FE will never game like RX Vega will. They already know that. And they're doing their best to keep gamers from making a bad decision by purchasing the Vega FE for gaming. And, all the while, still letting "prosumers" have a generous piece of the Vega gaming pie too. And they will be praised for doing so when all is said and done, not talked shit about for not doing so. Because they did. It was the smart thing to do. It was the right thing to do. And in the end: happy gamers, happy Vega owners all around (FE and RX), happy AMD. Everyone's happy.

Sounds stupid to you? Well, it's a good thing I'm not asking you then, isn't it? I'm telling you this is the way it is. If you don't believe it, I don't really care. Time, and only time, will tell whether what I've said here is true.
 
Joined
Sep 15, 2011
Messages
6,471 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Maybe AMD said it wasn't a gaming card, because it doesn't perform well in games? Sadly, it's also not good at anything else. It's just "some card".
:roll::roll::roll::roll::roll::roll::roll::laugh::laugh::laugh::laugh::laugh::laugh:
 
Joined
Mar 31, 2012
Messages
828 (0.19/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X
Motherboard QUANTA | ASUS Crosshair VII Hero
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3200 3400(OC) 14-14-14-34 @1.38v
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo
Display(s) 15,5" / 27"
Case Black & Grey | Phanteks P400S
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint KDE |UBUNTU | Windows 10 PRO
Benchmark Scores i dont care about scores
I thought at first that this card was built to fill a Machine Learning/Deep Learning "gap" in AMD's lineup, not for gaming. I am so tempted to get this card to build a deep learning server, but now I am so lost and confused after reading many comments here. lol
 
Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Sure it does. Being optimized to "do everything at every stage" does not mean it has to be better than any gaming card. Are you really under the impression that game devs are making games on GPU beasts that would blow all our GPUs out of the water?

No, they just make the program and make sure it works as intended as they go along, with enough detail for them to see their work. The tests of their implementation of the game engine come later, with gaming cards.
Who said anything about better? I am only saying "not much worse", compared to a gaming card with the same number of stream processors, using the same architecture, and having the same word, "Vega", in its name. And who said that you need a GPU beast to program? In that first paragraph you are really implying things that were never said, to make my point of view look biased.
As for the second paragraph, that's the idea behind the FE: not having to use a gaming card to see how the game engine REALLY performs. Let me be more specific here: how it really performs on a gaming card with the same specs, using the same architecture, with probably only the frequencies being different.
I thought at first that this card was built to fill a Machine Learning/Deep Learning "gap" in AMD's lineup, not for gaming. I am so tempted to get this card to build a deep learning server, but now I am so lost and confused after reading many comments here. lol
You need Instinct for these kinds of jobs.
Radeon Instinct™ MI Series
 
Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
The draw-stream binning rasterizer won't always be the rasterization approach that a Vega GPU will use. Instead, it's meant to complement the existing approaches possible on today's Radeons. AMD says that the DSBR is "highly dynamic and state-based," and that the feature is just another path through the hardware that can be used to improve rendering performance.
It's obvious that the driver has to have dynamic DSBR control enabled. What I wouldn't like to see is games having DSBR disabled altogether in the driver's game profile because it crashes the game, or frequent switching between modes.
 

Fx

Joined
Oct 31, 2008
Messages
1,332 (0.24/day)
Location
Portland, OR
Processor Ryzen 2600x
Motherboard ASUS ROG Strix X470-F Gaming
Cooling Noctua
Memory G.SKILL Flare X Series 16GB DDR4 3466
Video Card(s) EVGA 980ti FTW
Storage (OS)Samsung 950 Pro (512GB), (Data) WD Reds
Display(s) 24" Dell UltraSharp U2412M
Case Fractal Design Define R5
Audio Device(s) Sennheiser GAME ONE
Power Supply EVGA SuperNOVA 650 P2
Mouse Mionix Castor
Keyboard Deck Hassium Pro
Software Windows 10 Pro x64
Some people try to improve their status on forums by spewing negative comments as soon as any tiny problem arises with a new or upcoming product, acting as though they have "inside information", and making stuff up. Some people are afraid of being seen as a newbie or fanboy for saying they like something; it's easier to put down everything so they seem edgy and cool (at least in their own mind). Nothing new; I've noticed this tendency in bullies and the small-minded since I was in first grade (50 years ago), and now these same losers all have computers and can spew their hate to a much larger group of people. This negativity is a reliable indicator that they can be safely ignored. Or you can read their comments and take away a reinforced positive impression of the product (if this nimrod feels threatened by it, it must be a good product).

LOL. Sad, but true. People do tend to spew what they believe their peers want to hear rather than adding new lines of thought to a debate.

I still maintain my stance: wait until the consumer cards have arrived and get benchmarked.
 
Joined
Nov 5, 2004
Messages
385 (0.05/day)
Location
Belgium, Leuven
Processor I7-6700
Motherboard ASRock Z170 Pro4S
Cooling 2*120mm
Memory G.Skill D416GB 3200-14 Trident Z K2 GSK
Video Card(s) Rx480 Sapphire
Storage SSD Samsung 256GB 850 pro + bunch of TB
Case Antec
Audio Device(s) Creative Sound Blaster Z
Power Supply be quiet! 900W
Mouse Logitech G5
Keyboard Logitech G11
Well, this sounds.....PROMISING!!!!

I vividly recall that the "realistic" guess was that it (the actual gaming card, with drivers some months more mature) would perform between 1070 and 1080 speed.

I see now that the FE, with what are basically alpha drivers, is achieving those values already.

So add the extra speed of the gaming edition, and add the fact that this isn't the actual, decent gaming driver. Let this soak in some months of driver optimization (against the 1080, whose drivers are now solid and won't improve that much anymore).

- Gaming edition
- Drivers
- 1080 drivers already optimal
- AMD drivers age much better
- ...pricing(cost/performance)?

My guess is that the intel+nvidia combination will be beaten by the AMD+AMD combination :)

Clear win for AMD here folks
 
Joined
Sep 2, 2011
Messages
1,019 (0.22/day)
Location
Porto
System Name No name / Purple Haze
Processor Phenom II 1100T @ 3.8Ghz / Pentium 4 3.4 EE Gallatin @ 3.825Ghz
Motherboard MSI 970 Gaming/ Abit IC7-MAX3
Cooling CM Hyper 212X / Scythe Andy Samurai Master (CPU) - Modded Ati Silencer 5 rev. 2 (GPU)
Memory 8GB GEIL GB38GB2133C10ADC + 8GB G.Skill F3-14900CL9-4GBXL / 2x1GB Crucial Ballistix Tracer PC4000
Video Card(s) Asus R9 Fury X Strix (4096 SP's/1050 Mhz)/ PowerColor X850XT PE @ (600/1230) AGP + (HD3850 AGP)
Storage Samsung 250 GB / WD Caviar 160GB
Display(s) Benq XL2411T
Audio Device(s) motherboard / Creative Sound Blaster X-Fi XtremeGamer Fatal1ty Pro + Front panel
Power Supply Tagan BZ 900W / Corsair HX620w
Mouse Zowie AM
Keyboard Qpad MK-50
Software Windows 7 Pro 64Bit / Windows XP
Benchmark Scores 64CU Fury: http://www.3dmark.com/fs/11269229 / X850XT PE http://www.3dmark.com/3dm05/5532432
Well, this sounds.....PROMISING!!!!

I vividly recall that the "realistic" guess was that it (the actual gaming card, with drivers some months more mature) would perform between 1070 and 1080 speed.

I see now that the FE, with what are basically alpha drivers, is achieving those values already.

So add the extra speed of the gaming edition, and add the fact that this isn't the actual, decent gaming driver. Let this soak in some months of driver optimization (against the 1080, whose drivers are now solid and won't improve that much anymore).

- Gaming edition
- Drivers
- 1080 drivers already optimal
- AMD drivers age much better
- ...pricing(cost/performance)?

My guess is that the intel+nvidia combination will be beaten by the AMD+AMD combination :)

Clear win for AMD here folks

Well, Vega FE is being beaten by Xtreme Addict's 1400 MHz fully unlocked Fury Strix in Fire Strike Extreme. (Vega FE runs at 1600 MHz.)

http://forum.hwbot.org/showthread.php?t=142320

The driver issues are clear.
 
Joined
Jun 10, 2014
Messages
2,902 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
…Raja on the other hand says "Don't bother with FE. RX is just around the corner. It will be cheaper, probably clocked higher, and the gaming card you are looking for. No reason to go and spend $300-$500 more".
Raja saying that the FE card is not for gaming doesn't mean that the FE is bad at gaming. Just that there is no reason to go and pay extra for pro features in the drivers that are useless in games. He is just being more honest.

Also, no one says that those two cards will be 100% identical. RX will probably have higher frequencies and maybe a better cooling solution. I wouldn't be surprised if the RX is at the same level of performance as the liquid version of the FE, or a little better if clocked higher. Now, if AMD also manages in the next few days to enable some features that are currently disabled, features that increase performance in games, those features will become available to the FE card as well, improving its performance in games. You'll see that, if it happens.
You pretty much nailed it there. There is no reason for gamers to pay extra for features they don't need, which is why it's not marketed as a gaming-only card; that makes it a worse value for gaming, not any worse at it. The same goes for Titan and Quadro: they are excellent at gaming, but they are not the best options if gaming is all you do.
 