
Tom Clancy's "The Division" Gets DirectX 12 Update, RX 480 Beats GTX 1060 by 16%

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.29/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
Anyway, Fabio myth blown, time to move on.

Get w1z to do a review with the new driver and both cards overclocked and I'll care. Posting anything off the [H] typically just makes the world believe the opposite.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11

I'd love to, if it stopped you "not caring" this much.
 

cdawall

where the hell are my stars

It bothers me that people believe what [H] puts out. It reminds me way too much of people believing CNN.

Now, on the other end of the spectrum, has anyone seen the 2800-3000 MHz clocks they are getting out of the Galax version of the 1060? HOLY HELL.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I posted about that Galax card too, but they hated on it for being under LN2.

I can't win.
 

Joined
Sep 29, 2011
Messages
217 (0.05/day)
Location
Ottawa, Canada
System Name Current Rig
Processor Intel 12700K@5.1GHz
Motherboard MSI Pro Z790-P
Cooling Arctic Cooling Liquid Freezer II 360mm
Memory 2x16GB DDR5-6000 G.Skill Trident Z RGB
Video Card(s) MSI Gaming X Trio 6800 16GB
Storage 1TB SSD
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
Keyboard IBM Model "M"
This is great news for AMD users, but only when DX12 is properly implemented, which has sadly been a lottery so far.

A lottery with increasingly excellent odds of winning for AMD Radeon users.

nVidia left hardware schedulers out of Pascal back in the design phase, before it realized AMD had pressured Microsoft into incorporating DX12 (essentially Mantle) into Windows 10. By the time nVidia knew this, it was too late to change Pascal. nVidia hoped adoption of Windows 10 would be slow, but Microsoft gave it away for free for nearly a year.

So much for nVidia's plans. They were hoping to milk not-so-bright nVidiots over a longer time frame before they lost the gaming war with AMD (an inevitability, as AMD now has >25% of the overall x86 gaming market), but AMD had other plans, and Microsoft is a willing accomplice. Now nVidia is pushing like mad to get into self-driving cars and high-performance computing, because their days of making big $$ from add-in PC gaming GPUs are coming to an end, much like the add-in sound card days for SoundBlaster.

Once AMD releases Zen APUs with Vega graphics and their new memory fabric, the market for mid-range add-in GPUs will begin to evaporate, just as the low-end add-in GPU market mostly has. nVidia is getting painted into an ever-smaller unit-volume market at the high end, which is ironic, really, considering what a rip-off the price-performance proposition of a $600 GTX 1080 is. Why anyone continues to funnel that kind of money to the Green Goblin is beyond me. I never spend more than $300 (maybe $320) on a graphics card, and that's my hard limit.
 
Joined
Nov 5, 2004
Messages
385 (0.05/day)
Location
Belgium, Leuven
Processor I7-6700
Motherboard ASRock Z170 Pro4S
Cooling 2*120mm
Memory G.Skill D416GB 3200-14 Trident Z K2 GSK
Video Card(s) Rx480 Sapphire
Storage SSD Samsung 256GB 850 pro + bunch of TB
Case Antec
Audio Device(s) Creative Sound Blaster Z
Power Supply be quiet! 900W
Mouse Logitech G5
Keyboard Logitech G11

I would just love for you to be right.

However, M$ wasn't "pushed" into being an accomplice; they took the concept of Mantle and incorporated it to counter AMD (and make them lose market share & influence).

What is true is that the ngreedia fellows tried to milk this stagnation for as long as possible, but suddenly came face to face with a rapid DX12 roll-out.

Seeing as NV has tons more money for R&D than AMD, my guess is they have already made headway in development that is kept hush-hush. I am sure they have some tech on the shelf they can pull out of their *$$ to combat DX12.

I imagine that while AMD will have better DX12 utilisation, we will see NV using "something" to just crank out more power from what they have.
A reverse situation, where NV has far more TFLOPs in their cards than AMD, is certainly possible.
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64

So you've never been into buying high-end cards, you will always choose midrange, and you blame AMD's failure to keep pushing a high-end portfolio on Nvidia, who is still able to squeeze 30% more performance into a 225 W TDP every year. Meanwhile you consider 25% a healthy market share when there are only two companies in competition.

In the meantime, the majority of DX12 ports are not really showing any gains under DX12 for either company; only a small handful of games do. Native DX12 games are still extremely rare.

Sense, it makes none.

It's good that DX12 is paying off for AMD, but the fact is that AMD counted on that for waaaay too long, which is the reason they lost market share under DX11. The company that is much closer to market reality is actually Nvidia, because even today they can still easily transition to DX12, and across the board their cards still do more with less power. The dedicated GPU is here to stay for at least a few more decades; if not for gaming, then for GPGPU and scientific purposes, deep learning, AI, and so on. The GPU is a Swiss army knife and the companies will always find a use for it, just like the CPU is already super versatile. Gaming is just a tiny slice of the GPU pie, even if we would love to believe otherwise.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,851 (3.08/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
But nowhere near GTX 1070 performance, as some people claim an RX 480 can match a GTX 1070.... :p
The RX 480 is GTX 1060 territory. ;)

Wishful thinking.

How I see it is a $200 card vs. a $400 card, and for what? About 20 fps extra, so is it worth it? To be honest, only the buyer can decide that.
 

bug

Joined
May 22, 2015
Messages
13,163 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Expressing the difference between two video cards in FPS is wrong. What's 20 fps? The difference between 2 and 22? 42 and 62? 202 and 222? See how it doesn't work?
Percentages make more sense. But realistically speaking, the 1070 enables you to play at QHD while the 480 can only do FHD. Whether that's worth the price difference is indeed for the user to judge.
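A quick way to see why a raw FPS delta misleads is to turn the same 20 fps gap into a relative gain at different baselines. A minimal sketch in Python, using made-up frame rates rather than benchmark data:

Code:
# Hypothetical frame rates: the same +20 fps gap at three different baselines.
pairs = [(2, 22), (42, 62), (202, 222)]

for slow, fast in pairs:
    gain = (fast - slow) / slow * 100  # relative gain of the faster card
    print(f"{slow} -> {fast} fps: +{fast - slow} fps, +{gain:.0f}%")

# The identical 20 fps gap works out to +1000%, +48% and +10%,
# which is why percentages are the more meaningful comparison.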
 
Joined
Jun 28, 2014
Messages
2,388 (0.67/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes
When you double up and run CrossFire, you begin to see much better performance numbers. This is where the $235.00 for each RX 480 8GB begins to make sense: you can get two of these for not much more than one 1070.
A pair of RX 480 8GB cards handles all of my games nicely and without any lag to speak of. My 4K screen runs at 60 Hz, and these two RX 480s saturate it so that 4K resolutions are playable without jerking me around. (Shmooth!)
Yes, I know that the 1070 and 1080 cards will be quicker (but for a lot more money).
1060s don't even factor in, because NVIDIA chose to hobble the 60-series GPUs by dropping SLI support this time around (probably because they ~knew without a doubt~ that we would have jumped at the chance to SLI a pair of GTX 1060s and keep some money at home to eat with).
I have a pair of GTX 980 Ti cards that I pulled out of this PC just to test out the 480s for a while. 980 Ti cards in SLI are wonderful and probably what I'll keep buying instead of being disemboweled by NVIDIA for the newest thing.
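As a rough sanity check on the two-cards-for-the-price-of-one argument, here is a minimal sketch. Only the ~$235 RX 480 price comes from the post above; the GTX 1070 price, the CrossFire scaling factor, and the single-card performance ratio are illustrative assumptions, not measurements:

Code:
# Illustrative sketch only -- every number except the $235 RX 480 price is an assumption.
rx480_price = 235        # USD per card, as quoted in the post above
gtx1070_price = 430      # USD, assumed street price for a GTX 1070
crossfire_scaling = 0.8  # assumed: a second RX 480 adds ~80% of one card's performance
gtx1070_vs_rx480 = 1.5   # assumed: one GTX 1070 at ~1.5x a single RX 480

pair_cost = 2 * rx480_price          # 470
pair_perf = 1.0 + crossfire_scaling  # ~1.8x a single RX 480 when CrossFire scales

print(f"Two RX 480s:  ${pair_cost}, ~{pair_perf:.1f}x a single RX 480")
print(f"One GTX 1070: ${gtx1070_price}, ~{gtx1070_vs_rx480:.1f}x a single RX 480")
print(f"Perf per $100 (pair): {pair_perf / pair_cost * 100:.2f}")
print(f"Perf per $100 (1070): {gtx1070_vs_rx480 / gtx1070_price * 100:.2f}")

The comparison only holds in games where CrossFire actually scales, of course; in titles without multi-GPU support the pair falls back to single-card performance.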
 

AsRock

TPU addict

For the most part, but some of us have one particular game that must improve. To me, most benchmarks, even in-game ones, mean pretty much nothing and are just a little guide, as games like Arma cannot be benchmarked properly.

In the end it's down to the user, and if the benchmarked game in question were my go-to game, the 480 would be good enough and having the 1070 would be a waste.

Some require 30+, some require 60+, and these days even more want 100+. As long as I get 35+ fps, I am a happy gamer.

It's all about user requirements. I don't require 4K. My 290X is showing its age a little, but that still doesn't justify $200 over the 480, although, that said, the 480 is totally pointless for me.
 

bug

The original argument was 480 vs 1070 and the price difference in general.
Sure, there are corner cases and exceptions to everything, but in this particular case, your statement does not hold.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
However, M$ wasn't "pushed" into being an accomplice; they took the concept of Mantle and incorporated it to counter AMD (and make them lose market share & influence).
Developers demanded Mantle and, as an extension of that, Xbox One required DirectX 12. Both technologies are closer to the metal, so developers could squeeze more performance out of them. DirectX 12 was almost the last 3D API to go closer to the metal. The trend started, I believe, back with the PlayStation 3, which had an OpenGL-based closer-to-the-metal implementation to squeeze more power out of the CELL processor. Sony ported that library to the PlayStation 4, adapting and expanding it for AMD's APU. The push for DirectX 12 really began in earnest when Microsoft saw the performance figures of the Xbox One compared to the PlayStation 4. Not only did the PlayStation 4 have a beefier APU, it also had a closer-to-the-metal API for accessing it, which further accelerated its performance. Microsoft looked to Mantle as an example of what they could accomplish with the APU they already had. The Xbox One apparently received the DirectX 12 API update at the end of 2015.

AMD has a tight working relationship with Microsoft because of the Xbox One, which gave AMD a leg up in terms of DirectX 12. The API was practically designed to run on GCN (which Xbox One has). That said, NVIDIA has demonstrated with Futuremark's Time Spy that Pascal can run DirectX 12 just as well when optimized for it.

A reverse situation, where NV has far more TFLOPs in their cards than AMD, is certainly possible.
AMD tends to have more TFLOPs on paper (the exception being right now, because AMD won't have a response to Pascal until Vega), but they also tend to leave more shaders idle.
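For reference, paper TFLOPs are just shader count × clock × 2 FLOPs per shader per clock (one FMA). A minimal sketch using the approximate reference specs of the two cards in the thread title; real boost clocks vary by board and load:

Code:
# Theoretical FP32 throughput in TFLOPs = shaders * clock (MHz) * 2 / 1e6 (one FMA = 2 FLOPs).
def tflops(shaders: int, boost_mhz: int) -> float:
    return shaders * boost_mhz * 2 / 1e6

# Approximate reference specs -- actual boost clocks vary in practice.
print(f"RX 480:   {tflops(2304, 1266):.1f} TFLOPs")  # ~5.8 TFLOPs on paper
print(f"GTX 1060: {tflops(1280, 1708):.1f} TFLOPs")  # ~4.4 TFLOPs on paper

The paper gap favours the RX 480, which is the point above: raw TFLOPs only turn into frames if the shaders are actually kept busy.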


Back to topic: I tried Direct3D 12 in The Division and it does seem to get a few more frames, but it breaks alt+tab functionality, causing the game to crash when attempting to restore the window. Because of that, I run the game on Direct3D 11.
 

bug


What developers failed to realise is that on console they only had to deal with a handful of configurations at most, while the PC world is an entirely different beast. Now that developers have to put their money where their mouth is, we get DX11 titles with DX12 bolted on instead.
 

FordGT90Concept

"I go fast!1!11!1!"
That's going to change pretty quickly. 48.3% of the machines that participated in the Steam Hardware Survey have a DirectX 12 GPU and run Windows 10. An additional 25% have a DirectX 12 GPU but are running Windows 7 or 8.1, and could upgrade to fully support the API.

That said, major engines like Unreal Engine 4 are still struggling to adopt DirectX 12. DirectX 12 represents a pretty major paradigm shift in 3D engine design, so adoption in software is going to be slower than even the DirectX 9 to DirectX 10 transition. Games like The Division that bolt on Direct3D 12 support are examples of developers dipping their toes into the API. Eventually we'll start seeing Direct3D 12 games built from the ground up, with Direct3D 11 bolted on for backwards compatibility. Those are the games that will seriously benefit from the closer-to-the-metal APIs.
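As a quick sanity check on those survey figures (the 48.3% and 25% are the numbers quoted above; everything else is just addition), the potential DirectX 12 audience looks like this:

Code:
# Shares quoted above from the Steam Hardware Survey at the time.
dx12_gpu_and_win10 = 48.3    # % of surveyed machines that can run DX12 today
dx12_gpu_on_win7_81 = 25.0   # % with a DX12 GPU but still on Windows 7/8.1

potential = dx12_gpu_and_win10 + dx12_gpu_on_win7_81  # if those machines upgrade the OS

print(f"Can run DX12 today:              {dx12_gpu_and_win10:.1f}%")
print(f"Could run DX12 after OS upgrade: {potential:.1f}%")  # ~73.3%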
 

bug

My guess is that many (smaller) developers won't have either the resources or the will to implement (and test) DX12 and will stick to DX11 instead. The same goes for Vulkan versus OpenGL.
 

FordGT90Concept

"I go fast!1!11!1!"
Benchmark Scores Faster than the tortoise; slower than the hare.
Smaller developers are dependent on the engine they're using. The bulk of them are on Unreal Engine (discussed previously) and Unity. Unity is supposed to support DirectX 12, but I don't know if or when they implemented it. I do know Unity 5.6 supports Vulkan.
 