
DOOM with Vulkan Renderer Significantly Faster on AMD GPUs

Yeah, it's the same driver from Windows 7. WDDM 1.1 drivers can run on Windows 10, just as WDDM 1.2 drivers can. I bet you can go to AMD's website, download the old Windows 7 driver, then use Device Manager to install the display driver directly. None of this means the hardware is still being supported, though; it just means that old Windows 7 drivers still work on Windows 10.

Ok so again - find me a game that isn't working.
 
I'm not saying it will or won't work with whatever game you throw at it; my only point is that no updates means no active support, period, end of story. I've said nothing beyond that. I also don't have time to cobble together a machine with my old 2600 XT to figure that out because, like AMD, I don't really care anymore; I have several newer GPUs I can use first, such as one of my old 6870s. Also, there are some platforms where this isn't the case, such as Linux. For example, I can't install FGLRX on Ubuntu 14.04, 15.10, or 16.04 for my old 2600 XT, and I can't install FGLRX on my old laptop with a Mobility Radeon HD 3650 in it. Just because there is backwards compatibility in the places you care about, and it seemingly works when you use it, does not mean it's still actively supported.

For example, I'm using a Vista driver for the USB Wi-Fi card in my machine. They haven't released an update for a very long time, but I can still install the driver and run at a full 300 Mbit without an issue despite it being an old driver. Same deal with the ASMedia and C-Media AHCI drivers for my motherboard's third-party SATA and eSATA controllers. It doesn't need to be supported to work, but if something does break, don't expect it to get fixed.
 

Well, I think we are on the same page then. My entire point, though, is that anyone trying to say Nvidia's cards last longer due to "driver support" is full of BS. This is a non-point that derailed the conversation.

1) Things like newer OSes and DX/Vulkan/etc. become a far greater compatibility issue long before these old drivers do.

2) Like you said, you (and I as well) run plenty of things with old drivers that work perfectly fine. Once a driver is "perfected" there is nothing left for these companies to do - it will work fine. Furthermore, anyone who says "it could be an issue" should at least have anecdotal evidence before they pretend it is a thing.
 
You made it sound like AMD doesn't have any drivers at all and Nvidia is still making sure 8800 GTXs can run BF1.


How you infer anything about Nvidia from a simple "No" followed by a link from AMD (stating support has ended and no future driver releases are planned) is beyond me.
 
So what does any of this have to do with Vulkan?
 
Two points.
1) That park has green apples and red apples. Don't be so naive as to think a company as large as Nvidia is 'out'.
2) On Vega: the GTX 1080 smokes everything and it's only the 980 replacement. The GP100/102 chip is 'rumoured' to be 50% faster. That is Vega's competition.

Even without compute, Pascal's highly optimised and efficient core can run DX12 and Vulkan just fine. I'll wager £10 with you, through PayPal, that Vega won't beat the full Pascal chip.
If I lose, I'll be happy because it should start a price war. If I win, I'll be unhappy because Nvidia prices will reach the stratosphere.

OK, let's make clear what I wrote above then. I mean that at the same price level, Vega will have a party on nVidia GPUs, which will probably be the 1080, imho. Navi might be the competition for the next Titan, so Vega will go against the 1080 (Vega 10) and 1080 Ti (Vega 11). Except for power consumption, Vega will be a monster for DX12 and Vulkan. nVidia will need another architecture to compete with that, imho.
 
... For example, I can find a Linux driver for the GeForce 6 series (12 years old) updated in Nov 2015. And that gives me confidence when buying.

Good luck using that driver on a current kernel and X.org. On DX10 and older hardware you're stuck with nouveau unless you install an old, unsupported distro. Same thing with ATI/AMD TeraScale hardware: you only have the free driver as an option.

How is the jump in performance for Kepler when using Vulkan?
 
This dude's video explains where the extra performance comes from:


The relevant part starts at around 8:46.

The reason AMD cards are gaining so much is not so much that Vulkan is so much better, but rather that AMD's OpenGL driver is so much worse than nVidia's: look at the CPU overhead on both camps with OGL. It explains why nVidia's gains are so much lower: there's much less room for improvement with nVidia.
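
To put some purely made-up numbers on that point (none of these figures come from the video or from any benchmark), here is a tiny sketch of why a driver that is already light on CPU overhead has less to gain from Vulkan: if the frame is limited by whichever of the CPU submission path or the GPU render time is slower, only the CPU-bound card moves much when submission gets cheaper.

#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical per-frame costs in milliseconds; the scene (GPU work) is the same.
    const double gpu_render           = 10.0;
    const double ogl_cpu_heavy_driver = 14.0;  // driver with lots of OpenGL overhead
    const double ogl_cpu_light_driver =  6.0;  // driver that already submits cheaply
    const double vulkan_cpu           =  3.0;  // thin Vulkan submission path

    // Frame time is roughly limited by whichever side is the bottleneck.
    auto frame_ms = [&](double cpu_ms) { return std::max(cpu_ms, gpu_render); };

    std::printf("heavy OGL driver: %.1f ms -> Vulkan: %.1f ms (big gain)\n",
                frame_ms(ogl_cpu_heavy_driver), frame_ms(vulkan_cpu));
    std::printf("light OGL driver: %.1f ms -> Vulkan: %.1f ms (little headroom)\n",
                frame_ms(ogl_cpu_light_driver), frame_ms(vulkan_cpu));
    return 0;
}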
 
Same situation as DX11: Nvidia's driver overhead was already low, so unsurprisingly they show smaller gains.

People are essentially celebrating mediocrity.
 

Doesn't bode well for Nvidia when saying that. Looking at how a 480 can creep up to their 1070.

Both are still rendering the same thing, and AMD's lower-price-tier cards are jumping one, maybe two, spots while Nvidia's are staying stagnant.
 

Yeah, based on one game that hammers along regardless of Vulkan (at least on Nv hardware).

We get it, Nvidia are DOOOOOOOOOOOOMED!
 
The reason AMD cards are gaining so much is not so much that Vulkan is so much better, but rather that AMD's OpenGL driver is so much worse than nVidia's: look at the CPU overhead on both camps with OGL. It explains why nVidia's gains are so much lower: there's much less room for improvement with nVidia.
Sure, but just to speculate a bit: GPU vendors each do their own OpenGL implementation, and I'm willing to bet nVidia was already batching draw calls under the hood, putting them in a queue and flushing it when the right OpenGL command came along, whereas AMD was probably just doing each draw call on the spot. I mean, the CPU number is higher, sure, but I'm willing to bet that huge number next to "GPU" is an indicator that the machine is going gung-ho on calls that hit the GPU. I wouldn't be surprised if this was something nVidia was simply doing in driver space to smooth out the actual calls to the API so the application can more quickly get on to the next thing it was going to do. Just a thought, as this is the kind of thing I would do if I needed to get latency down and the work could be done later, but still in order. Queues are great for that kind of thing.

You know, this entire thing could merely be the difference between nVidia loosely following the specification, but well enough for everything to work, while doing things under the hood to make it go faster, versus AMD strictly implementing the specification and suffering the consequences as a result. It's an interesting thought, that's for sure.
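
For what it's worth, here is a minimal sketch of the kind of deferred-flush batching being speculated about above. Everything in it (the DrawCall struct, the queue, the flush trigger) is hypothetical and just illustrates the general idea of recording calls cheaply and submitting them in one go instead of on the spot; it is not how any actual driver is written.

#include <cstdio>
#include <vector>

struct DrawCall { int mesh_id; };  // stand-in for whatever state a real call carries

class CommandQueue {
    std::vector<DrawCall> pending;
public:
    // "On the spot" would mean issuing the call right here; instead we only record it,
    // so the application returns almost immediately and moves on to its next task.
    void enqueue(const DrawCall &dc) { pending.push_back(dc); }

    // Flush everything at once, e.g. when a state change or buffer swap forces it.
    void flush() {
        std::printf("submitting %zu batched draw calls in one go\n", pending.size());
        pending.clear();
    }
};

int main() {
    CommandQueue q;
    for (int i = 0; i < 100; ++i)
        q.enqueue({i});  // cheap, low-latency recording
    q.flush();           // one comparatively expensive submission, still in order
    return 0;
}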
 

The GPU numbers on AMD's cards are bugged: even the video poster says so. The CPU numbers seem accurate, though.

There's a good portion of AMD's card resources that go unused, which are now being tapped by Vulkan. The resources have been there the whole time but AMD has been unable to put them to good use. What Vulkan shows us is how much better the cards CAN BE if their resources are properly managed. In this regard, nVidia has been WAY more efficient.
 

This is what happens when game coders are fucking lazy... There is a reason AMD cards dominate when used for real workloads... They are more powerful, period...
 
This. If AMD had the money to throw at devs the way Nvidia has over the years, things would look MUCH different. But thanks to the time and effort AMD put into Mantle, we may well see a major revolution in games. Provided, of course, game devs use the tools GIVEN to them.
 
Lol, Vulkan isn't "biased". AMD GPUs are just more advanced when it comes to more direct GPU access (which Vulkan and DX12 allow); the fact they weren't shining is because software wasn't taking advantage of any of that yet. Till now. I mean, AMD has had partial async compute since the HD 7000 series and full async compute since the R9 290X. NVIDIA still doesn't have even partial support in the GTX 1080 from the looks of it. Async is when you can seamlessly blend graphics, audio and physics computation on a single GPU, something AMD has been aiming at basically the whole time since they created GCN. They support graphics, they added audio on the R9 290X, and they've been working with physics for ages, some with Havok and some with Bullet.

R9 Fury X users don't feel that let down anymore :p In fact, R9 Fury cards in general shine in DX12 and apparently also in Vulkan. While I love my GTX 980, I kinda regret not going with an R9 Fury/Fury X.

Also, for people saying "async emulation", there is no such thing; either you have a hardware implementation or you don't. You can't emulate a feature whose sole purpose is a massive performance boost through seamless blending of graphics and compute tasks. This is the same as emulation of pixel shaders when they became a thing with DirectX 8: either you had them or you didn't. There were some software emulation techniques, but they were so horrendously slow it just wasn't feasible to use them in real-time rendering within games. Async is no different. And NVIDIA apparently doesn't have it. Which kinda sucks when you pay 700+ € for a brand new graphics card...
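
For anyone curious what "async" actually looks like from the application side, here is a minimal hedged sketch of submitting graphics and compute work to two separate Vulkan queues. The device, the two queue family indices and the pre-recorded command buffers are assumed to already exist (setting them up is far more code), and whether the two submissions genuinely overlap on the GPU is entirely up to the hardware, which is exactly the point being argued above.

#include <vulkan/vulkan.h>

// Assumes `device` exposes a graphics-capable family and a separate compute-capable
// family, and that gfxCmd / compCmd were recorded elsewhere. Illustrative only.
void submit_graphics_and_compute(VkDevice device,
                                 uint32_t gfxFamilyIndex, uint32_t compFamilyIndex,
                                 VkCommandBuffer gfxCmd, VkCommandBuffer compCmd)
{
    VkQueue gfxQueue  = VK_NULL_HANDLE;
    VkQueue compQueue = VK_NULL_HANDLE;
    vkGetDeviceQueue(device, gfxFamilyIndex, 0, &gfxQueue);
    vkGetDeviceQueue(device, compFamilyIndex, 0, &compQueue);

    VkSubmitInfo gfxSubmit{};
    gfxSubmit.sType              = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    gfxSubmit.commandBufferCount = 1;
    gfxSubmit.pCommandBuffers    = &gfxCmd;

    VkSubmitInfo compSubmit{};
    compSubmit.sType              = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    compSubmit.commandBufferCount = 1;
    compSubmit.pCommandBuffers    = &compCmd;

    // Two independent queues: a GPU with real async compute can execute these
    // concurrently, while other hardware may simply serialize them.
    vkQueueSubmit(gfxQueue, 1, &gfxSubmit, VK_NULL_HANDLE);
    vkQueueSubmit(compQueue, 1, &compSubmit, VK_NULL_HANDLE);
}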



I guess that's how NVIDIA fanboys are comforting themselves after buying a super expensive GTX 1000 series graphics card (or GTX 900) that sucks against the last generation of AMD cards, which weren't particularly awesome even back then. "Uh oh, it doesn't lose any performance." Well, you also gain none. What's the point then? The whole point of Vulkan/DX12 is to boost performance. When devs cram more effects into games assuming all these gains, your performance will actually tank where AMD's will remain unchanged. How will you defend NVIDIA then?



Ummmm... yeah. If that rant against even the former-generation 900 series cards isn't you showing your AMD fanboyism, I don't know what is. You're simply wrong; the 980/Ti beat the competition hands down. The ONLY thing you said right is "against the last generation of AMD cards, which weren't particularly awesome". There you are absolutely right: AMD wasn't particularly awesome. The GTX 980 matched or beat their top card in everything up to 2K resolution and was cheaper, and the Ti destroyed it at every resolution... how is that possibly, objectively, Nvidia "sucks" versus last-gen AMD? If I owned a Fury I'd say the same thing; you're just not right and you're ranting.
 
Am I the only one wondering why there's not a single 1080 card on that list and only a 1070?

Yep, I saw it straight away and wondered why they'd leave it off, except that it was right up there with the AMD cards, so they couldn't possibly show that.
 
If Vulkan is so biased towards AMD cards, I doubt any major developers will risk lower sales just to make games run faster on AMD by offering renderers based solely on Vulkan.
Gratz!!! You won the clueless poster award.
Blizzard loves Mac support, so they'll use Vulkan; Valve helped make Vulkan, so they'll use it; and Bethesda is already using it. That just leaves EA and Ubisoft.
Edit: Vulkan will also be the graphics API for Android. What harms Vulkan harms Nvidia Shield and their push into Android.
 
Doom was made by id Software. Id has used OpenGL since forever, so id using Vulkan was kind of inevitable.
 
All in all, a healthy development.
And yes, my eldest son gained well over 30 fps on his AMD 290 card with Doom.
I would say Vulkan is here to stay and will replace OpenGL. Consoles will be using it happily as well, squeezing some more performance out of those boxes for some more years to come.

But in any case, Nvidia was already highly optimized for many games, and now AMD is catching up. Healthy competition that Nvidia, as a company, should be happy with.
Because it would certainly not be in their interest to be the sole provider of graphics chips.
Game on...
 

?

AMD only exists so Intel can say "look, there's still a competitor". Nvidia works the same way. Both could undercut AMD so badly that they'd file for bankruptcy and a sale would be started within a few months.

Both are also probably scared that a juggernaut with deep pockets (like Samsung) would acquire all the IP and put them under. Keeping AMD barely alive is highly profitable for these schmucks.
 

AMD is at 30% GPU market share and gaining; I wouldn't count them out in that market. Vulkan and DX12, as well as the bitcoin mining craze, have painted them in a good light.
 
AMD's future is riding on Zen more so than anything else.
 
It depends on AMD not making another catastrophe, but I fear that no amount of progress will change the direction AMD is going. Most of what they've been doing as of late is adequate, but it will take a huge push for them to actually make headway. Intel is a juggernaut, much more so than nVidia. I'm hoping Zen will be good, but we can't kid ourselves when you consider the likely R&D budgets of the two companies. You're talking about AMD, which takes in a few billion USD in revenue a year, versus Intel, which takes in tens of billions a year. Granted, Intel's breadth in the market is much wider than AMD's, but it only shows the vast difference in capability between the two companies. I love AMD, but I think the sad reality is that AMD lost this war a long time ago and it's only a matter of time until they get relegated to the likes of companies like VIA... what a sad day that will be. :(
 