
AMD Radeon Vega 64 Outperforms NVIDIA GTX 1080 Ti in Forza Motorsport 7, DX 12

Joined
Mar 14, 2008
Messages
282 (0.07/day)
Likes
77
Location
DK
System Name Main setup
Processor I7 5930K
Motherboard Asus X99 Deluxe
Cooling Water
Memory Kingston 16GB 2400 DDR4
Video Card(s) Asus GTX 1080TI STRIX OC
Storage Samsung 960EVO, Crucial MX300 750GB Limited edition, Kingston SSDnow V+ Samsung 850 EVO
Display(s) ASUS PB287Q 4K
Audio Device(s) onboard
Power Supply Corsair RX750M
Mouse MMO 800,000-button thing that I can't remember the name of.
Keyboard Logitech G19
Software W10
#26
Joined
Jun 10, 2014
Messages
929 (0.62/day)
Likes
402
#27
The test wasn't based on the in-game benchmark, so as to avoid specifically-optimized scenarios.
Anything but a reproducible benchmark is completely worthless.

Drivers don't optimize games in real time. In fact, driver optimizations are mainly limited to profiles with driver parameters, with rare cases of tweaked shader programs or special code paths in the driver. The driver never knows the internal state of the game.
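Purely as an illustration of what "a profile with driver parameters" amounts to, here is a minimal sketch; the structure, fields, and game names are hypothetical and do not represent any vendor's actual driver code.

```cpp
#include <map>
#include <string>
#include <iostream>

// Hypothetical per-game driver profile: a bag of tuning parameters keyed by
// the executable name. Real drivers ship data like this, but the fields and
// names here are invented purely for illustration.
struct DriverProfile {
    bool reorderShaderCompiles;   // substitute tuned shader variants
    int  prerenderedFrames;       // queue depth override
    bool forceTiledOptimization;  // special code path toggle
};

static const std::map<std::string, DriverProfile> kProfiles = {
    {"forza7.exe",   {true, 2, true}},
    {"somegame.exe", {false, 3, false}},
};

int main() {
    // The driver only matches the process name against its profile table;
    // it has no visibility into the game's internal state at runtime.
    auto it = kProfiles.find("forza7.exe");
    if (it != kProfiles.end())
        std::cout << "Applying profile, prerendered frames = "
                  << it->second.prerenderedFrames << "\n";
}
```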

This will probably be the case when more than 90% of all games made are dx12 only.
No, this is another case of a console game ported to PC. There is nothing inherent in Direct3D 12 that benefits AMD hardware.
 

Nate1492

New Member
Joined
Sep 11, 2017
Messages
6 (0.02/day)
Likes
4
#28
Anything but a reproducible benchmark is completely worthless.

Drivers don't optimize games in real time. In fact, driver optimizations are mainly limited to profiles with driver parameters, with rare cases of tweaked shader programs or special code paths in the driver. The driver never knows the internal state of the game.


No, this is another case of a console game ported to PC. There is nothing inherent in Direct3D 12 that benefits AMD hardware.
The thing is, there are other 'console ports' that don't follow this pattern.

Mind you, it's not really a port anymore; since they share the same x86 processor family, it's more a case of 'tweak the game differently'.
 
Joined
Dec 31, 2009
Messages
12,355 (3.95/day)
Likes
6,904
Location
Ohio
System Name Daily Driver
Processor 7960X 4.5GHz 16c/16t 1.17V
Motherboard MSI XPower Gaming Titanium
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) EVGA GTX 1080 FTW3
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) 27" Acer Predator 2560x1440 144hz IPS + Yamakasi 27" 2560x1440 IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#29
^1080Ti just sayin'
Unlikely; in terms of sheer HP, the 1080 Ti is still better.
That's a HUUUUUGE assumption.

Didn't they have to adjust a config file on the beta to get it to run past the 60 fps limit or something?

Then, oddly enough, as the resolution increases, AMD cards with HBM/HBM2 typically do better; here, however, the lower-bandwidth NVIDIA cards are catching up... I don't get that, and it's part of the reason I'm not sold on this becoming the norm.

As I said in the other thread, so far it is an anomalous result.

I don't think the surprise is that Vega does so well: it's that the RX 580 does so poorly at high resolutions. Forza Motorsport 7 is a game made for the Xbox One X at 4K and, by extension, for Polaris, yet the RX 580 performs well below the GTX 1060 it should easily trump.


I think what's happening is that AMD's memory subsystem rears its head again. When bandwidth isn't a problem, GCN performs fantastically. As bandwidth demands increase, GCN takes a bigger hit than Pascal does. Vega manages to stay on top only because of HBM2. In more demanding games, even HBM2 is not enough to satisfy Vega.

This is actually quite easily explained by the use of 8xAA. That's not going to result in many cache misses, which hides AMD's problem and gives a misleadingly good impression.
I think 8xAA at 4K explains it.
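Some rough numbers make the 8xAA point concrete. This is only a back-of-envelope sketch, assuming a single 32-bit colour target plus a 32-bit depth target and ignoring the aggressive MSAA/delta compression real GPUs apply:

```cpp
#include <cstdio>

int main() {
    // Back-of-envelope render-target footprint at 4K with 8x MSAA.
    // Assumes one 32-bit color target and one 32-bit depth target; real games
    // use more targets, and real GPUs compress MSAA surfaces aggressively.
    const double pixels  = 3840.0 * 2160.0;        // ~8.29 Mpix
    const double samples = pixels * 8.0;           // 8x MSAA
    const double bytes   = samples * (4.0 + 4.0);  // color + depth per sample
    const double gib     = bytes / (1024.0 * 1024.0 * 1024.0);
    printf("Raw 8x MSAA color+depth at 4K: %.2f GiB per frame\n", gib); // ~0.49 GiB

    // One naive write pass over that at 60 fps is ~30 GB/s of traffic, yet
    // samples inside a triangle are identical, so MSAA compression collapses
    // most of it; access stays coherent instead of cache-missy, which is the
    // point made above.
    printf("Naive write traffic at 60 fps: %.1f GB/s\n", bytes * 60.0 / 1e9);
}
```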
 
Joined
Dec 5, 2006
Messages
7,697 (1.81/day)
Likes
747
Processor i7-5820K
Motherboard Gigabyte X99-UD4
Cooling Corsair H115i Pro
Memory 16GB Crucial DDR4-2400
Video Card(s) Asus Strix GTX 1070
Storage 240GB HyperX 3k, 500GB Samsung 850EVO, 2x WD Black 640gb, WD Black 1TB, Seagate 10TB
Display(s) 2x Dell U2518D 2560x1440
Case BitFenix Ghost
Audio Device(s) Focusrite Saffire PRO 14 -> DBX DriveRack PA+ -> Mackie MR8 and MR10 / Corsair Vengeance 2100
Power Supply Corsair RM1000i
Mouse Logitech G500s
Keyboard Logitech G110
Software Windows 10 x64 Pro
#30
Joined
Nov 5, 2004
Messages
385 (0.08/day)
Likes
104
Location
Belgium, Leuven
Processor I7-6700
Motherboard ASRock Z170 Pro4S
Cooling 2*120mm
Memory G.Skill D416GB 3200-14 Trident Z K2 GSK
Video Card(s) Rx480 Sapphire
Storage SSD Samsung 256GB 850 pro + bunch of TB
Case Antec
Audio Device(s) Creative Sound Blaster Z
Power Supply be quiet! 900W
Mouse Logitech G5
Keyboard Logitech G11
#31
DX12 games are growing in numbers, making this more relevant day by day.
 
Joined
May 22, 2015
Messages
3,844 (3.32/day)
Likes
1,416
Processor Intel i5-6600k
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x8GB DDR4 2400 G.Skill
Video Card(s) EVGA GTX 1060 SC
Storage 128 and 256GB OCZ Vertex4, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Chieftec BX01
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
#32
DX12 games are growing in numbers, making this more relevant day by day.
I don't know about "day by day". Starting from zero, of course the usage can only go up. But DX12 is 26 months old and we still don't have 26 DX12-only titles. Just imagine if you bought a "futureproof" video card for Christmas 2015 :D
 
Joined
Aug 6, 2009
Messages
688 (0.21/day)
Likes
157
Location
Chicago, Illinois
#33
Forza is a neutral title.

As I see some 1080 owners just popped up and started crying.
Who is crying? For one, it's just ONE game performing well. Two, 1080 Ti owners can probably afford another video card, so it's not a big deal. I just got a 1080 Ti, still have a 1080, and can return the 1080 Ti whenever I want to change cards or upgrade to a Vega 64 :D

All that being said, I'd hate to have to swap cards whenever I want to play a DX12 game, though.
 
Joined
Dec 31, 2009
Messages
12,355 (3.95/day)
Likes
6,904
Location
Ohio
System Name Daily Driver
Processor 7960X 4.5GHz 16c/16t 1.17V
Motherboard MSI XPower Gaming Titanium
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) EVGA GTX 1080 FTW3
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) 27" Acer Predator 2560x1440 144hz IPS + Yamakasi 27" 2560x1440 IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#34
DX12 games are growing in numbers, making this more relevant day by day.
True... but how long has it been out, and how many titles are DX12? Not as many as we would all like to see.

Also, this doesn't happen in other DX12 titles... so is it really DX12 in the first place? We hope so, but are we sure?
 
Joined
Mar 9, 2012
Messages
98 (0.04/day)
Likes
26
Processor Core 2 Quad Q9550 @ 3.7 GHz
Motherboard Gigabyte GA-P35-S3L
Memory 8 GB DDR2-870
Video Card(s) Geforce GTX 1060 6GB
#36
Mind you, it's not really a port anymore; since they share the same x86 processor family, it's more a case of 'tweak the game differently'.
It doesn't matter if they are all x86. Intel and AMD CPUs have different architectures, and optimizing for one does not mean it'd run just as well on the other.

The same applies to GPUs. Consoles use AMD GPUs, so when porting to PC, developers have to optimize for nvidia too, especially with DX12, where the driver plays a much smaller role than in DX11, though some don't bother to.

DX12 games are growing in numbers, making this more relevant day by day.
We had 9 non-store DX12 games in 2016 and a grand total of 3 in 2017 so far. That's a decrease. Maybe 2018 will be better?

We could also count Store games, but that's a slippery slope: Store games can only be DX12, so developers have no choice even if they didn't want to use it. Look at Quantum Break: it's a DX12 game on the Windows Store, but when it came to Steam it became DX11.

We had 5 Store-only DX12 games in 2016 and 2 in 2017 so far, all published by MS.
 
Joined
Sep 5, 2004
Messages
1,887 (0.37/day)
Likes
257
Location
The Kingdom of Norway
System Name Wiak's Gaming Rig 2017
Processor Ryzen 1700X
Motherboard ASUS PRIME X370-PRO
Cooling Noctua NB-9B SE2
Memory Corsair Vengenace LPX 3200 CL16 @ 2933
Video Card(s) MSI Radeon 480 8GB Gaming X
Storage Samsung 960 EVO 500GB / Samsung 850 EVO 1TB
Display(s) 3x Dell U2412M
Case Corsair 200R
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850
Mouse Corsair Sabre Laser
Keyboard Logitech Orion Brown (G610)
Software Windows 10?
#37
Is this a 'using DX12 the right way' scenario, or just an implementation where AMD is favored in the programming?

Is the engine just anti-NVIDIA?

Will other games follow where Polaris and Vega can shine?
The same way Project CARS is anti-AMD ;)
 
Joined
Dec 19, 2008
Messages
174 (0.05/day)
Likes
100
Location
WA, USA
System Name Desktop
Processor AMD Ryzen 1700X
Motherboard ASRock Killer SLI/AC
Cooling Corsair H100i
Memory 32GB DDR4 3200
Video Card(s) PowerColor RX Vega 64
Storage 480GB MyDigitalSSD NVME
Display(s) LG 34UC88
Case NZXT S340
Power Supply 650w
Mouse Razer Deathadder Chroma
Keyboard Razer Ornata Chroma
#38
My guess is this is due to the overhead DX12 games have on Nvidia hardware, similar to how DX11 games are often limited by AMD driver overhead. The 1080 Ti is probably not pegging 100% utilization, similar to this:


With CPUs as fast as they are now, I don't think this will crop up too often, but it's just one example of hardware beating fast software.
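For anyone wanting to verify the "not pegging 100% utilization" hunch, one approach is to compare GPU execution time against wall-clock frame time using D3D12 timestamp queries; if the GPU finishes well before the CPU submits the next frame, the bottleneck is on the CPU/driver side. A minimal sketch, assuming the device, queue, timestamp query heap, and readback buffer already exist and with error handling omitted:

```cpp
#include <d3d12.h>
#include <cstdint>

// Records two timestamps around the frame's GPU work and resolves them into
// a readback-heap buffer. The caller supplies all objects.
void RecordGpuTimestamps(ID3D12GraphicsCommandList* cmd,
                         ID3D12QueryHeap* timestampHeap,  // TIMESTAMP heap, >= 2 slots
                         ID3D12Resource* readback)        // >= 2 * sizeof(UINT64)
{
    cmd->EndQuery(timestampHeap, D3D12_QUERY_TYPE_TIMESTAMP, 0); // frame start
    // ... record all draw/dispatch work for the frame here ...
    cmd->EndQuery(timestampHeap, D3D12_QUERY_TYPE_TIMESTAMP, 1); // frame end
    cmd->ResolveQueryData(timestampHeap, D3D12_QUERY_TYPE_TIMESTAMP,
                          0, 2, readback, 0);
}

// After the frame's fence has signalled, convert the ticks to milliseconds.
double GpuFrameMs(ID3D12CommandQueue* queue, ID3D12Resource* readback)
{
    UINT64 freq = 0;
    queue->GetTimestampFrequency(&freq);

    UINT64* ticks = nullptr;
    D3D12_RANGE readRange{0, 2 * sizeof(UINT64)};
    readback->Map(0, &readRange, reinterpret_cast<void**>(&ticks));
    const double ms = double(ticks[1] - ticks[0]) * 1000.0 / double(freq);
    D3D12_RANGE emptyWrite{0, 0};
    readback->Unmap(0, &emptyWrite);
    return ms; // if this is far below the CPU frame time, the GPU is idling
}
```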
 
Joined
Jun 10, 2014
Messages
929 (0.62/day)
Likes
402
#40
I don't know about "day by day". Starting from zero, of course the usage can only go up. But DX12 is 26 months old and we still don't have 26 DX12-only titles. Just imagine if you bought a "futureproof" video card for Christmas 2015 :D
Those who bought "future proof" AMD cards back then are still waiting for them to take off, basing their anecdotes on games which are outliers rather than the norm, and concluding that every game can perform like that on this (superior) hardware.
Meantime, in the real world the competition looks better than ever.
 
Joined
Aug 6, 2009
Messages
688 (0.21/day)
Likes
157
Location
Chicago, Illinois
#41
Those who bought "future proof" AMD cards back then are still waiting for them to take off, basing their anecdotes on games which are outliers rather than the norm, and concluding that every game can perform like that on this (superior) hardware.
Meantime, in the real world the competition looks better than ever.
I don't know man, there are some people who are really, really into Forza that would buy the card just for this game.
 
Joined
Oct 15, 2013
Messages
18 (0.01/day)
Likes
4
System Name My PC
Processor Intel 3770k
Motherboard MSI Z77a-GD65
Cooling Corsair H100i
Memory Corsair 2133mhz 16gb
Video Card(s) xfx HD 7870 Black x2
#42
It doesn't matter if they are all x86. Intel and AMD CPUs have different architectures, and optimizing for one does not mean it'd run just as well on the other.

Same applies to GPUs. Consoles use AMD GPUs, so when porting to PC, developers have to optimize for nvidia too, specially with DX12 where the driver plays a much smaller role compared to DX11, although some don't bother to.


We had 9 non-store DX12 games in 2016 and a grand total of 3 in 2017 so far. That's a decrease. Maybe 2018 would be better?

We could also count store games but that's a slippery slope. Store games can only be DX12 so developers have no choice even if they didn't want to use it. Look at quantum break. It's a DX12 game on store but when it came to steam it became DX11.

We had 5 store only DX12 games in 2016 and 2 in 2017 so far, all published by MS.

Correct me if I'm wrong, but I thought I read somewhere that for DX12 to be utilized correctly, the game engine needs to be built on DX12 rather than having it tacked onto an existing engine.
 
Joined
Jun 10, 2014
Messages
929 (0.62/day)
Likes
402
#43
Correct me if I'm wrong, but I thought I read somewhere that for DX12 to be utilized correctly, the game engine needs to be built on DX12 rather than having it tacked onto an existing engine.
Correct. More or less all current games use an abstraction layer to make Direct3D 12 act like Direct3D 11.
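A deliberately simplified sketch of that wrapper pattern is below; the class and method names are invented for illustration, but the shape (D3D11-style immediate calls funneled serially into one D3D12 command list) is why such ports see little benefit from the new API:

```cpp
#include <d3d12.h>

// Illustrative only: a D3D11-style "immediate context" facade over D3D12.
// Names are invented; real engine wrappers are far more involved.
class FakeImmediateContext {
public:
    explicit FakeImmediateContext(ID3D12GraphicsCommandList* cl) : cl_(cl) {}

    // D3D11-ish entry points translated 1:1 onto a single command list.
    void SetPipeline(ID3D12PipelineState* pso)    { cl_->SetPipelineState(pso); }
    void Draw(UINT vertexCount, UINT startVertex) { cl_->DrawInstanced(vertexCount, 1, startVertex, 0); }

    // Everything is recorded serially on the render thread, so the main
    // D3D12 win (spreading command recording across many CPU cores) is
    // never exercised.
private:
    ID3D12GraphicsCommandList* cl_;
};
```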
 

MxPhenom 216

Corsair Fanboy
Joined
Aug 31, 2010
Messages
12,169 (4.22/day)
Likes
3,718
Location
Seattle, WA
System Name The Battlestation
Processor Intel Core i7 4770k @ 4.2GHZ 1.275v
Motherboard MSi Z97 Gaming 5
Cooling EK Supremacy w/ EK Coolstream PE360
Memory G. Skill Trident X 16Gb (4x4GB) 2400mhz @ 1.65v
Video Card(s) MSi GTX1070 Gaming X 8GB @ 2GHz
Storage Samsung 830 128GB SSD, Crucial MX200 500GB, Seagate Barracuda 2TB (2x 1TB Partitions)
Display(s) Qnix QX2710 27" 2560 x 1440 PLS @ 100hz
Case Phantek Enthoo Evolv ATX TG
Audio Device(s) MSi Gaming AudioBoost ALC1150 w/ Sennheiser Game Ones
Power Supply Seasonic Flagship Prime Platinum 850
Mouse Steelseries Rival 600 w/ Hyper X Mat
Keyboard Corsair K70 w/ MX Browns and Red Backlit
Software Windows 10 Pro 64-Bit
Benchmark Scores Firestrike: 15439
#44
Is this a 'using DX12 the right way' scenario, or just an implementation where AMD is favored in the programming?

Is the engine just anti-NVIDIA?

Will other games follow where Polaris and Vega can shine?
Considering it was made with the Xbox One in mind, I'd say it just favors AMD hardware.
 
Joined
Sep 26, 2012
Messages
252 (0.12/day)
Likes
114
System Name ATHENA
Processor AMD 1950X
Motherboard ASUS Zenith Extreme
Cooling Noctua NH-U12S, 3xNoctua IndustrialPPC 120mm 2000RPM PWM, 3xSilverstone AP 180mm 1200RPM
Memory 4x16GB Trident-Z 16GB 3866mhz
Video Card(s) ASUS 1080 Ti Strix
Storage 1xIntel 600p 1TB
Display(s) Acer Z35 35" 21:9 2560x1080 200hz
Case Silverstone FT02
Power Supply Silverstone ST1500-TI
Mouse FinalMouse Classic Ergo 2 + Evoluent VerticalMouse C
Keyboard Logitech DinovoEdge
Software Windows 10 + OpenSUSE Tumbleweed
#46
IMO, the texture assets in Forza are insane, and this is a prime example of how quickly you can feed the card being the primary driver.

1080p/1440p - HBM2's lower latency feeds the GPU faster, even though its bandwidth is similar to GDDR5X.

4K - Vega's VRAM buffer is exceeded and it starts caching from system memory, while the textures still fit fully in VRAM on the 1080 Ti. The lower FPS also slows the rate of fill requests.
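The "bandwidth similar to GDDR5X" part checks out on paper; a quick sketch of the peak numbers for the two cards in question, computed from their published memory specs:

```cpp
#include <cstdio>

int main() {
    // Peak memory bandwidth from published specs.
    // Vega 64: 2048-bit HBM2 at 945 MHz (1.89 Gbps effective per pin).
    // GTX 1080 Ti: 352-bit GDDR5X at 11 Gbps per pin.
    const double vega_gb_per_s = 2048.0 / 8.0 * 1.89;  // ~484 GB/s
    const double ti_gb_per_s   = 352.0  / 8.0 * 11.0;  // ~484 GB/s
    printf("Vega 64 HBM2:    %.0f GB/s\n", vega_gb_per_s);
    printf("GTX 1080 Ti G5X: %.0f GB/s\n", ti_gb_per_s);
    // Near-identical peak bandwidth, so any 1080p/1440p advantage would have
    // to come from latency/access patterns as suggested above, while the 4K
    // drop-off lines up with the 8 GB vs 11 GB capacity difference.
}
```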
 
Joined
May 31, 2016
Messages
656 (0.84/day)
Likes
111
System Name Bro
Processor Intel I7 3770K @4Ghz
Motherboard ASrock Extreme 4 Z77
Cooling BOX
Memory Kingston 16 GB Hyper Beast XMP @ 1866Mhz
Video Card(s) Gigabyte GTX 780Ti Windforce 1150/7000
Storage 2x SSD OCZ vertex 3 120GB / Sasmsung Black 1TB
Display(s) LG 27UD69 UHD
Case Cooler Master Silverlight
Audio Device(s) realtec 5.1
Power Supply Corsair AXi 760W
Mouse Logitech G402
Keyboard Logitech slim
Software Win 7 ultimate 64bit/windows 10 64 bit
#48
I think there's more to consider here than HBM2 memory, or the idea that DX12 always favors AMD, or that it's a console port (BTW, consoles use the same kind of hardware as PCs now). I think it comes down to the coding: programmers have to write the game to use what each card is capable of and the resources it has. What I'm saying is that there's no single straight answer to this. You have to see the bigger picture and put together everything that matters. Saying HBM2 alone makes it faster, or anything like that, is nonsense and not true. There are more important factors in this equation, but I guess it would be hard to list them all since there are so many.
 
Joined
Sep 29, 2011
Messages
196 (0.08/day)
Likes
76
Location
Ottawa, Canada
System Name Current Rig
Processor AMD Ryzen 7 1700@3.95GHz
Motherboard Asus X370 Crosshair VI
Cooling Arctic Cooling 240mm
Memory 2x8GB DDR4-3200 G.Skill Trident Z RGB
Video Card(s) Gigabyte Windforce R9 290 (BIOS flashed to 1050MHz core)
Storage 1TB SSD
Display(s) 3x22" LG Flatron (eyefinity)
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
#49
It's unlikely that MS would favor AMD over NVIDIA "just because". It is known that AMD in general works better in DX12. NVIDIA has a way higher market share. Unless MS wants to create a more equal balance by doing so in favor of AMD. Then again, MS games aren't sold in numbers as high as other games, so... not really sure.
Actually, it's a common misconception that nVidia has a 'higher market share'. Yes, on the PC platform alone nVidia has the higher market share, but people forget that the PS4 and Xbox consoles are really just x86 PCs with some custom hardware tweaks. If you look at it THAT way, you can see that AMD Radeons likely power more than 60% of the x86 gaming market. nVidia is actually slowly getting squeezed out of the x86 gaming market.

Here's a link from 2016 showing AMD with 56% of the x86+GPU market--I'm projecting that it's higher now, since consoles generally outsell PC graphics cards (if anyone has anything more recent, please post): https://www.pcgamesn.com/amd/57-per-cent-gamers-on-radeon

Game developers know this, and they are in fact migrating their coding over to Radeon-optimized code paths. And before anybody jumps on this and says 'but nVidia customers spend more money on hardware', well, it's important to understand that to a game developer, some millionaire with two GTX 1080 Tis is only worth the same $60 as a 12-year-old with an Xbox One. They're both spending $60 on the latest World of Battlefield 7 (yes, a made-up game) title, so in the end the developer is looking at which GPU is in the most target platforms.

If I'm Bethesda, for example, I realize that over 60% of the x86 machines that have sufficiently powerful GPUs (XBox, Playstation, PC) that I'd like to put my game on are powered by AMD Radeons. It's only logical that I'm therefore going to build the game engine to run very smoothly on Radeons. All Radeons since the HD7000-series in 2011 have hardware schedulers, which makes them capable of efficiently scheduling/feeding the compute/rendering pipelines from multiple CPU cores right in the GPU hardware. This makes them DX12 optimized, whereas nVidia GPUs, even up to today's 1000-series, still don't have this built-in hardware.
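For what it's worth, the "feeding from multiple CPU cores" being described corresponds to D3D12's multithreaded command-list recording. A minimal sketch of the submission pattern, assuming the command lists and their allocators were created and reset elsewhere (this shows the API shape, not a full renderer):

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>

// Each worker thread records draws into its own pre-created command list;
// the queue then consumes them all in a single submission. The per-thread
// recording work itself is assumed to exist elsewhere.
void SubmitFrame(ID3D12CommandQueue* queue,
                 const std::vector<ID3D12GraphicsCommandList*>& lists)
{
    std::vector<std::thread> workers;
    for (ID3D12GraphicsCommandList* cl : lists) {
        workers.emplace_back([cl] {
            // ... record this thread's share of the frame into cl ...
            cl->Close();  // finish recording on the worker thread
        });
    }
    for (auto& w : workers) w.join();

    // One call hands every recorded list to the GPU.
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```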

The extra hardware does increase the Radeon's power draw somewhat, and nVidia has made much of how their GPUs are more 'power efficient', but in reality, they're simply missing additional hardware schedulers, which if included in their design, would probably put them on an even level with AMD's power consumption. It's a bit like saying my car is slightly more fuel efficient than yours because I removed the back seats. Sure, you will use a little less gas, but you can't carry any passengers in the back seat, so it's a dubious 'advantage' you're pushing there.

Those who bought "future proof" AMD cards back then are still waiting for them to take off, basing their anecdotes on games which are outliers rather than the norm, and concluding that every game can perform like that on this (superior) hardware.
Meantime, in the real world the competition looks better than ever.
Well, I bought a Radeon R9 290 for $259 Canadian back in December of 2013, and it has played every game I like extremely well, and it might just run upcoming fully optimized DX12 titles like Forza 7 as well as or better than an $800 1080 Ti in the 99th-percentile frame times, so I'd hardly call that a fail. In fact, when the custom-cooled Vegas hit, I'm probably going to upgrade to one of those in my own rig and put the R9 290 in my HTPC, because it'll probably still be pumping out great performance for a couple more years to come.
 
Joined
Dec 31, 2009
Messages
12,355 (3.95/day)
Likes
6,904
Location
Ohio
System Name Daily Driver
Processor 7960X 4.5GHz 16c/16t 1.17V
Motherboard MSI XPower Gaming Titanium
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) EVGA GTX 1080 FTW3
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) 27" Acer Predator 2560x1440 144hz IPS + Yamakasi 27" 2560x1440 IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#50
and it might just run upcoming fully optimized DX12 titles like Forza 7 as well as or better than an $800 1080 Ti in the 99th-percentile frame times, so I'd hardly call that a fail
I wouldn't hold my breath on that....
 