
The secret of Doom (2016) performance on AMD GPUs

Joined
Apr 18, 2013
Messages
1,260 (0.32/day)
Location
Artem S. Tashkinov
It's simple: the game was developed specifically for GCN as reported by a leading developer.

[Image: doom_vs_nvidia.png]


Watch the presentation from SIGGRAPH2016.
 
Joined
Jan 2, 2012
Messages
1,079 (0.24/day)
Location
Indonesia
Processor AMD Ryzen 7 5700X
Motherboard ASUS STRIX X570-E
Cooling NOCTUA NH-U12A
Memory G.Skill FlareX 32 GB (4 x 8 GB) DDR4-3200
Video Card(s) ASUS RTX 4070 DUAL
Storage 1 TB WD Black SN850X | 2 TB WD Blue SN570 | 10 TB WD Purple Pro
Display(s) LG 32QP880N 32"
Case Fractal Design Define R5 Black
Power Supply Seasonic Focus Gold 750W
Mouse Pulsar X2
Keyboard KIRA EXS
Joined
Apr 30, 2011
Messages
2,648 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
It's simple: the game was developed specifically for GCN as reported by a leading developer.

View attachment 77413

Watch the presentation from SIGGRAPH2016.
In OpenGL, which the game used at launch and until recently, nVidia was much better. So no matter how the game was developed, AMD's awful OpenGL drivers couldn't put their GPUs to work properly. When Vulkan came around, AMD GPUs were finally fully utilized and took the lead, since Vulkan is Mantle's successor. To sum up, Vulkan is made to use AMD GPUs perfectly and nVidia ones a bit better than OpenGL does. Where is the news in that?
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1

Not that I'm trying to take a stab at nVidia, but I find it very interesting that the engine starts rendering the next frame before the current frame has been post-processed by any filters when utilizing async compute. Doing that will always favor GPUs with more CUs/SMs when the support is there.
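That overlap is easy to show with a toy timing model (plain Python, nothing GPU-specific; the millisecond numbers are invented for illustration): when post-processing of frame N runs concurrently with the start of frame N+1, the total time drops below the serial sum.

```python
# Toy model of frame pipelining with async compute.
# The numbers are illustrative only -- not real GPU timings.

def serial_frames(n_frames, render_ms, post_ms):
    """Each frame fully finishes (render + post-process) before the next starts."""
    return n_frames * (render_ms + post_ms)

def overlapped_frames(n_frames, render_ms, post_ms):
    """Post-processing of frame N overlaps with rendering of frame N+1.
    Assumes post_ms <= render_ms, so post-processing never becomes the bottleneck."""
    # Every frame after the first hides its predecessor's post-processing
    # behind its own render work; only the last frame's post is exposed.
    return n_frames * render_ms + post_ms

print(serial_frames(100, 12, 4))      # 1600 ms
print(overlapped_frames(100, 12, 4))  # 1204 ms
```

The gap between the two totals is exactly the hidden post-processing time, which is why a GPU with spare CUs/SMs to soak up the concurrent compute work benefits most.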
 
Joined
Sep 2, 2011
Messages
1,019 (0.22/day)
Location
Porto
System Name No name / Purple Haze
Processor Phenom II 1100T @ 3.8Ghz / Pentium 4 3.4 EE Gallatin @ 3.825Ghz
Motherboard MSI 970 Gaming/ Abit IC7-MAX3
Cooling CM Hyper 212X / Scythe Andy Samurai Master (CPU) - Modded Ati Silencer 5 rev. 2 (GPU)
Memory 8GB GEIL GB38GB2133C10ADC + 8GB G.Skill F3-14900CL9-4GBXL / 2x1GB Crucial Ballistix Tracer PC4000
Video Card(s) Asus R9 Fury X Strix (4096 SP's/1050 Mhz)/ PowerColor X850XT PE @ (600/1230) AGP + (HD3850 AGP)
Storage Samsung 250 GB / WD Caviar 160GB
Display(s) Benq XL2411T
Audio Device(s) motherboard / Creative Sound Blaster X-Fi XtremeGamer Fatal1ty Pro + Front panel
Power Supply Tagan BZ 900W / Corsair HX620w
Mouse Zowie AM
Keyboard Qpad MK-50
Software Windows 7 Pro 64Bit / Windows XP
Benchmark Scores 64CU Fury: http://www.3dmark.com/fs/11269229 / X850XT PE http://www.3dmark.com/3dm05/5532432
Current and future consoles are GCN based.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,378 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
It's simple: the game was developed specifically for GCN as reported by a leading developer.

View attachment 77413

Watch the presentation from SIGGRAPH2016.

I watched the whole damn thing. It absolutely was not written for AMD GCN. It was written for everyone, but it can use extensions much more effectively to deliver better performance from AMD's GCN cards. It's very misleading to claim that Vulkan (in Doom) was written for AMD cards. It simply uses their hardware better for what they wanted to do with the graphics engine.

No, the simple truth is that the Vulkan pathway in Doom is better used by GCN. The coding id Software did for the game means they can use GCN far more effectively than OpenGL can manage on its own. It's as @HD64G says: AMD are piss poor at OpenGL compared to Nvidia, but the GCN hardware just runs like butter on Vulkan. And I mean butter - as in smooth.

Current and future consoles are GCN based.

The new Nintendo thingy-ma-jig is Tegra.
 
Joined
Aug 27, 2015
Messages
555 (0.18/day)
Location
In the middle of nowhere
System Name Scrapped Parts, Unite !
Processor Ryzen 5 3600 @4.0 Ghz
Motherboard MSI B450-A Pro MAX
Cooling Stock
Memory Team Group Elite 16 GB 3133Mhz
Video Card(s) Colorful iGame GeForce GTX1060 Vulcan U 6G
Storage Hitachi 500 GB, Sony 1TB, KINGSTON 400A 120GB // Samsung 160 GB
Display(s) HP 2009f
Case Xigmatek Asgard Pro // Cooler Master Centurion 5
Power Supply OCZ ModXStream Pro 500 W
Mouse Logitech G102
Software Windows 10 x64
Benchmark Scores Minesweeper 30fps, Tetris 40 fps, with overheated CPU and GPU
....aaannd Nvidia PR be like "this game is not a reliable benchmark in DX12 mode, please do not include this game in your GPU review"
 
Artem S. Tashkinov
Just a hint: with OpenGL, NVIDIA also has its own extensions and no one seems to have a problem with that. But oh noes, Vulkan uses AMD-specific goodies. Huh?

What OpenGL games? Aside from id Software only Indie devs use OpenGL. And Indies run so fast no one cares.

It was written for all but it can much more effectively use extensions to deliver the better performance from AMD's GCN cards.

No, simple truth is, the Vulkan pathway in Doom is better used by GCN.

You contradict my statement but then prove the opposite. Twice. Great.

I guess you are that person who hates every released PhysX/NVIDIAworks based game. When something is optimized for NVIDIA, NVIDIA are bloody cheaters.

When something is optimized from the get-go for GCN, then 1) NVIDIA cannot run it at all, 2) NVIDIA cannot develop good GPUs, 3) NVIDIA sucks this and that. Double standards all the f*cking time.

Meanwhile no AMD fanatic is concerned that AMD's OpenGL drivers generally suck.
 

Aquinus

Resident Wat-man
What OpenGL games? Aside from id Software only Indie devs use OpenGL. And Indies run so fast no one cares.
Any game that runs on OS X has OpenGL support. That does include Blizzard, which I would hardly call Indie.
 
Joined
Nov 1, 2008
Messages
4,213 (0.75/day)
Location
Vietnam
System Name Gaming System / HTPC-Server
Processor i7 8700K (@4.8 Ghz All-Core) / R7 5900X
Motherboard Z370 Aorus Ultra Gaming / MSI B450 Mortar Max
Cooling CM ML360 / CM ML240L
Memory 16Gb Hynix @3200 MHz / 16Gb Hynix @3000Mhz
Video Card(s) Zotac 3080 / Colorful 1060
Storage 750G MX300 + 2x500G NVMe / 40Tb Reds + 1Tb WD Blue NVMe
Display(s) LG 27GN800-B 27'' 2K 144Hz / Sony TV
Case Xigmatek Aquarius Plus / Corsair Air 240
Audio Device(s) On Board Realtek
Power Supply Super Flower Leadex III Gold 750W / Andyson TX-700 Platinum
Mouse Logitech G502 Hero / K400+
Keyboard Wooting Two / K400+
Software Windows 10 x64
Benchmark Scores Cinebench R15 = 1542 3D Mark Timespy = 9758
Is Vulkan a closed system like PhysX and HairWorks? What are the barriers to NVIDIA using this game?

Is the code badly optimized? That is, does it force a card to render things that barely improve the game visually, yet are a severe detriment to performance on NVIDIA cards?

I guess you are that person who hates every released PhysX/NVIDIAworks based game. When something is optimized for NVIDIA, NVIDIA are bloody cheaters.

When something is optimized from the get-go for GCN, then 1) NVIDIA cannot run it at all, 2) NVIDIA cannot develop good GPUs, 3) NVIDIA sucks this and that. Double standards all the f*cking time.

Meanwhile no AMD fanatic is concerned that AMD's OpenGL drivers generally suck.

Is that a general rant or did someone say that in this thread?
 
Artem S. Tashkinov
Any game that runs on OS X has OpenGL support. That does include Blizzard, which I would hardly call Indie.

Most if not all macOS games that were originally released for Windows run via a D3D-to-OpenGL translator of some sort.

hair works

HairWorks uses standard D3D APIs - nothing NVIDIA-specific. This argument is invalid. Now that AMD has reimplemented tessellation in Polaris and The Witcher runs fine on AMD even with HairWorks on, everyone has magically forgotten about this fact.

And if I'm not mistaken, W1zzard doesn't use PhysX-enabled games in his test bed (or disables PhysX completely). So the PhysX argument is equally invalid.
 
Joined
Nov 1, 2008
Location
Vietnam
Most if not all MacOS games, which were originally released for Windows, run via a D3D to OpenGL translator of some sort.



HairWorks uses standard D3D APIs - nothing NVIDIA-specific. This argument is invalid. Now that AMD has reimplemented tessellation in Polaris and The Witcher runs fine on AMD even with HairWorks on, everyone has magically forgotten about this fact.

And if I'm not mistaken, W1zzard doesn't use PhysX-enabled games in his test bed (or disables PhysX completely). So the PhysX argument is equally invalid.

I wasn't really making an argument. I'm not that well versed in the NVIDIA vs. AMD battle, but isn't it generally about one side doing something to negatively affect performance on their competitor's cards?
 
Artem S. Tashkinov
I wasn't really making an argument. I'm not that well versed in the NVIDIA vs. AMD battle, but isn't it generally about one side doing something to negatively affect performance on their competitor's cards?

This thread was created to calm the WCCFtech fanatics who have found their way to TPU. Every Pascal/Polaris review on TPU has at least three very vocal people demanding that Doom be included in the test suite, or else "this review is completely flawed and I don't trust TPU any longer".
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
There is no such thing as "written for GCN". Vulkan and DX12 are standardized APIs, and AMD is simply better at them. It probably has something to do with the fact that this is their fourth generation of async-capable GPUs. It's funny how everyone is quick to blame AMD for playing dirty when it dominates NVIDIA in Vulkan, but when NVIDIA dominated AMD in OpenGL, it was just a "fact" and no one ever argued it...
 
Joined
Jan 2, 2012
Location
Indonesia
Is Vulkan a closed system like PhysX and HairWorks? What are the barriers to NVIDIA using this game?

It's not closed to registered hardware vendors and contributors. Vulkan is like OpenGL: any registered IHV (Independent Hardware Vendor) can provide specific extensions for their hardware.

The problem for NVIDIA is that, because Vulkan is based on Mantle (developed by AMD), AMD was much better prepared with their hardware (specifically async compute) and vendor-specific extensions, so developers can utilize their stuff from the get-go; they don't need to wait for driver or extension updates.
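That extension mechanism can be sketched in a few lines. This is a toy illustration in Python, not real Vulkan calls (a real application would enumerate extensions with something like vkEnumerateDeviceExtensionProperties); the extension names are real registered ones, but the selection logic and "driver" sets are invented for the example:

```python
# Simplified sketch of vendor-extension negotiation: the app asks the
# driver which extensions it advertises and picks a code path accordingly.
# The sets below are invented examples, not a real driver's output.

def pick_shader_path(supported_extensions):
    """Choose the best available code path from the advertised extensions."""
    if "VK_AMD_shader_ballot" in supported_extensions:
        return "gcn_intrinsics"          # AMD's vendor-specific fast path
    if "VK_NV_glsl_shader" in supported_extensions:
        return "nv_glsl"                 # NVIDIA's early porting extension
    return "portable"                    # plain standardized Vulkan fallback

amd_driver = {"VK_KHR_swapchain", "VK_AMD_shader_ballot"}
nv_driver = {"VK_KHR_swapchain", "VK_NV_glsl_shader"}

print(pick_shader_path(amd_driver))  # gcn_intrinsics
print(pick_shader_path(nv_driver))   # nv_glsl
```

The point is that a vendor whose extensions ship early and expose more of the hardware gets its fast path taken from day one, while everyone else falls back to the portable path.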
 

the54thvoid

Intoxicated Moderator
Staff member
You contradict my statement but then prove the opposite. Twice. Great.

I guess you are that person who hates every released PhysX/NVIDIAworks based game. When something is optimized for NVIDIA, NVIDIA are bloody cheaters.

When something is optimized from the get-go for GCN, then 1) NVIDIA cannot run it at all, 2) NVIDIA cannot develop good GPUs, 3) NVIDIA sucks this and that. Double standards all the f*cking time.

Meanwhile no AMD fanatic is concerned that AMD's OpenGL drivers generally suck.

Lol. Calm it, antsy pants.

My recent run is a Titan (original), 780 Ti Classy (x2) and now a Kingpin 980 Ti with a modded Bitspower water block. I'm a goddamned Nvidia funder (that assemblage is over £3,000 worth).

You're still incorrect about Vulkan in Doom being written for AMD. The code was written for everyone; to implement it for AMD it was easy to apply the extensions that utilise the GCN hardware. Nvidia doesn't have the same hardware to better utilise low-level APIs. The id Software guy said they like to use asynchronous compute and will use it more in future - not because it favours AMD, but because it's easier and better for them.
Nvidia handles everything equally now. AMD doesn't. All low-level APIs have an inherent bias towards specialist hardware, currently found in GCN. Well done AMD for being there.
Still, the sheer power of Pascal means the GTX 1080, with its lower transistor count, still reigns supreme in OpenGL.

No, I'm no fanboy; I attack and defend both parties. Ironically, both sides call me a fanboy, so I must be neutral by matter of conflict resolution.
 
Joined
Oct 2, 2004
That's like saying DX12 is more prepared for AMD because the core idea behind it was also Mantle. That's nonsense. AMD is simply better because they've been preparing for this moment for years, and they are now harvesting the fruits of their long and, back then, greatly underappreciated hard work.

@the54thvoid
When you piss both sides off, you're doing something right.
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
A bit of useless trivia: the first vendor-specific extension for Vulkan was released for Nvidia-only hardware: http://www.phoronix.com/scan.php?page=news_item&px=Vulkan-1.0.5-Released It's only there to make porting easier.

AMD just took the technological lead, like with AMD64 (x64). There is no favoritism from the developers of Doom.
And yes, the OpenGL drivers on AMD cards suck. Just try, for example, Dolphin Emulator and compare the performance of AMD and Nvidia cards in OpenGL and DirectX 11; it's a good test because the Dolphin developers used almost all vendor-specific extensions.
 

Aquinus

Resident Wat-man
Most if not all macOS games that were originally released for Windows run via a D3D-to-OpenGL translator of some sort.
That's quite the generalization, assuming they all do some form of translation. :confused:
 
Joined
Oct 2, 2004
A bit of useless trivia: the first vendor-specific extension for Vulkan was released for Nvidia-only hardware: http://www.phoronix.com/scan.php?page=news_item&px=Vulkan-1.0.5-Released It's only there to make porting easier.

AMD just took the technological lead, like with AMD64 (x64). There is no favoritism from the developers of Doom.
And yes, the OpenGL drivers on AMD cards suck. Just try, for example, Dolphin Emulator and compare the performance of AMD and Nvidia cards in OpenGL and DirectX 11; it's a good test because the Dolphin developers used almost all vendor-specific extensions.

At this point no one even argues about OpenGL on Radeons. But to be realistic, OpenGL wasn't actually a real issue for Radeon either. I've played UT99, Deus Ex (both with a D3D10-equivalent custom OpenGL renderer), Q3A, Doom 3, Quake 4 and lastly Rage on Radeon graphics cards. Sure, old games by today's standards. But back then I was playing them on a Radeon 9600 Pro, later an X1950 Pro, and several variants of the HD4000, HD5000, HD6000 and HD7000 series. Have I ever felt like "omg, this OpenGL really sucks on Radeon"? Nope, never. Maybe it looked worse on the fancy graphs, but in real-world conditions there was no difference worth mentioning.
 
Joined
Oct 2, 2015
Location
Argentina
At this point no one even argues about OpenGL on Radeons. But to be realistic, OpenGL wasn't actually a real issue for Radeon either. I've played UT99, Deus Ex (both with a D3D10-equivalent custom OpenGL renderer), Q3A, Doom 3, Quake 4 and lastly Rage on Radeon graphics cards. Sure, old games by today's standards. But back then I was playing them on a Radeon 9600 Pro, later an X1950 Pro, and several variants of the HD4000, HD5000, HD6000 and HD7000 series. Have I ever felt like "omg, this OpenGL really sucks on Radeon"? Nope, never. Maybe it looked worse on the fancy graphs, but in real-world conditions there was no difference worth mentioning.

In known games it just works, but if you test specific declared extensions (ones the driver reports the hardware as compatible with), they just don't work, crash, etc.
Here are some examples: https://es.dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and-opengl-drivers-hall-fameshame/
 
Joined
Mar 18, 2008
Messages
5,717 (0.98/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
I wonder how the fanboys will respond when Nvidia shifts to an ALU-heavy structure in Volta. Welp, they'll probably switch their stance immediately and praise DX12/Vulkan.

In a fanboy's world, if Nvidia is doing badly, it is somehow AMD's fault.
 
Joined
Oct 2, 2004
It's an indie emulator. Do you seriously believe they'll invest driver-team time into it compared to games sold in millions of copies in a market that heavily depends on a functional ecosystem? I highly doubt it. So if it's broken, that's just how it is. I'm not making excuses for AMD, I'm just being realistic here. It's the same reason AMD never bothered to invest a lot of time and resources into fixing OpenGL: they knew what they were preparing and working on - Mantle, the predecessor of Vulkan and DX12. Since they are a for-profit company, they invest where the most profit is expected. OpenGL apparently wasn't that big of an issue after all to be worth bothering with. If we're honest, the whole thing has been greatly overblown. Sure, it might perform worse than NVIDIA, but there is always something one is a bit better at than the other.
 