
Why AMD will perform better than NVIDIA in DirectX 12

Remember how Pirelli tire sales dropped when people heard they suck in F1? Even though F1 high-end tires have no connection whatsoever to the road tires...
 
I think everyone's just losing sight of the fact that they're talking about current-gen cards, right now - the only people that benefit are the ones who got lucky with a high-end AMD card, since their investment drags out a little longer before needing an upgrade.

Yup. Hawaii and Fiji will have great longevity in a DX12 environment.

But what people are really missing is this (and I'll repost the graph that has been reposted already).

[Graph: DX12 batches, 4K, 4xMSAA]


If Nvidia's Maxwell architecture can't handle async (and this is an async-utilising bench), why is it on par with an async monster? Fury X and 980 Ti are equal cost (or close enough - in the UK the 980 Ti is cheaper or more expensive, custom-card dependent). Fiji has a shitload of shading power, and in DX12 it's showing its true legs. But, and this is a huge freaking but - Maxwell (the most awful async hardware ever, apparently) is still on par with Fiji.

So - Fiji is more awesome than Maxwell at DX12 because of async shaders, but in a benchmark that uses async shaders (I don't even know how many queues AoS uses) the two cards are pretty much equal (and in fact, Fiji's lead drops at 'heavy' - maybe clock speed is helping NV there?).
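
(For anyone wondering what "queues" actually means here, below is a rough, purely illustrative C++/D3D12 sketch of async compute - nothing from AoS or anyone in this thread. It assumes you already have an ID3D12Device and two recorded command lists; the function and parameter names are made up. The point is that the API just exposes separate graphics and compute queues, and how much of that work actually overlaps is down to the GPU's scheduler - the ACEs on GCN.)

```cpp
// Illustrative sketch only (not from AoS or this thread); error handling omitted.
// Assumes an existing ID3D12Device* plus two already-recorded command lists.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void SubmitGraphicsAndComputeAsync(ID3D12Device* device,
                                   ID3D12CommandList* gfxWork,      // e.g. g-buffer pass
                                   ID3D12CommandList* computeWork)  // e.g. lighting/post compute
{
    // Two queues: graphics (DIRECT) and compute. In a real engine these are
    // created once at startup, not per submit. On GCN the ACEs feed the
    // compute queue; how much the work actually overlaps is up to the GPU.
    D3D12_COMMAND_QUEUE_DESC gfxDesc  = {};
    gfxDesc.Type  = D3D12_COMMAND_LIST_TYPE_DIRECT;
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

    ComPtr<ID3D12CommandQueue> gfxQueue, compQueue;
    device->CreateCommandQueue(&gfxDesc,  IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&compQueue));

    // Kick off both; the GPU may run them concurrently ("async compute").
    gfxQueue->ExecuteCommandLists(1, &gfxWork);
    compQueue->ExecuteCommandLists(1, &computeWork);

    // The only ordering guarantee is the fence: here the graphics queue waits
    // until the compute queue has signalled that its results are ready.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    compQueue->Signal(fence.Get(), 1);
    gfxQueue->Wait(fence.Get(), 1);
}
```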

So what AMD folksies should be very worried about is that a crippled, DX11-focused architecture is matching a DX12-designed architecture on a benchmark from the guys that leveraged Mantle for AMD.

Think about it... If this is AMD's best shot (and they've been very vocal about how bad async on Maxwell is), why is it only achieving a tiny lead (if any)?

This isn't me banging NV's drum, but the penny just dropped when I started writing: the crippled card performs near to or equal to a card designed to run it...
 
So is everyone just assuming that ACEs are going to be the staple standard "moar horses" for DX12 performance?

Seems odd, considering nobody gave a damn up until AMD started shouting about how they have more.
 
Seems odd, considering nobody gave a damn up until AMD started shouting about how they have more.
Ahhh, the AMD PR machine at work.
 
AMD always has more.

Mostly more units sitting on shelves. Let's leave it at that and wait it out. This is pointless.
 
I think Mussels hit the nail on the head, but some people refuse to think AMD are able to have any kind of plan.

They were in a place to make sure they had the best hardware to make use of what has become DX12. Nvidia could have done this, but the idea of giving software away is in their past.

Will this async stuff damage Nvidia for DX12? Do they already have a plan to sell CUDA 4 to devs which will do most of the stuff and "more"?

Time will tell.

I can say this: after nearly 2 years with a 290X that keeps gaining fps with driver updates, I think it was about the best £330 I ever spent on a GPU :D
 
I think Mussels hit the nail on the head, but some people refuse to think AMD are able to have any kind of plan.

They were in a place to make sure they had the best hardware to make use of what has become DX12

Agreed that it's only turds who think AMD engineers are useless (just the management). I imagine AMD winning the consoles ('winning' here is subjective) gave them the impetus to rally against the DX11 shortcomings and to develop more towards an architecture that was driver-agnostic. In doing so, with their limited pool of R&D resources, they worked on Mantle to deliver that driver-minimising code, allowing the bare metal to be used without complex, game-specific optimisation (what NV do well, until they break it). Mantle would be seen as favourable as a code path because it relied less on gfx vendors writing their own code and let devs create a nice game (yes, I'm simplifying things a bit here).
Reality hits, though, and with NV's market share Mantle doesn't look as though it's going to gain traction. All the while MS is working on a bare-metal API, and slowly DX12 is born. Blah blah Vulkan this, Khronos that, et voilà: open-source, non-proprietary, wholesome goodness.

Of course AMD has a plan - they've had it for years. And it's working to massively level the playing field - taking away NV's software (driver) optimisation edge.

Still, it's only levelled it for now, with one benchmark. What if queue depths are kept under 31 and NV uses master rasterisation gimmicks or whatever? Do we seriously expect NV to play fair? Of course not. I might buy Nvidia gfx cards (and kill baby seals and eat kittens, cos we're all evil), but I'm also aware of how quickly NV will work with devs to ensure AAA titles they're involved with absolutely take that edge away from AMD. Is that fair? NO. But it's business.

It will be very interesting to watch this play out, but as is repeatedly being pointed out and people seem to ignore, AoS (from the guys that gave you the first Mantle showpiece) using async only makes it level. Is this AMD's (Fiji's) best scenario? If so, Nvidia will be licking their lips for their turn. And we should all be scared, because it will be a Gameworks apocalypse :fear:
 
Pretty sure by the time Pascal comes out, they will beat AMD in DX12. And there should be some DX12 games to compare by then.
 
Pretty sure by the time Pascal comes out, they will beat AMD in DX12. And there should be some DX12 games to compare by then.

Even if they do, and that's a big if since AMD has had a good head start on designing hardware for DX12 as opposed to DX11, it's not a big deal either way. You can see from the graph, based on the ONE benchmark/game out now, that the Nvidia top dog, not designed for DX12, and the AMD top dog, designed for DX12, are not very far apart.

I foresee AMD improving, and Nvidia having Pascal designed for DX12, and that there won't be a heck of a lot of difference between the competitors at most price points. Personally, I think a helluva lot of noise is being made on both sides about the issue right now, when there is one DX12 game out and there will only be a handful of DX12 games by the time Pascal drops.

I don't think either side's strident and zealous believers can really claim anything at this point.
 
I don't like it when AMD makes their products for the future; they did it with Bulldozer... I don't buy my PC parts to get better gains a year or two in the future, I want them now. Intel and Nvidia seem to understand this, or AMD simply has no other choice but to do business this way with their limited budget.
 
I use an AMD logo as an avatar because I respect the company.
Sorry, I kinda find it hard to respect a company that claims things which turn out to be lies, or not as good as their boasts.

Crazy juice? I'm not the one in denial about the importance of asynchronous shaders and nVidia's lack of them.
What do you expect? AMD created async shaders, so of course they're going to be better at it than Nvidia to start with. If you want to make that kind of comparison, you should start turning PhysX and HairWorks on in game testing, as that would make the testing fair. I doubt async is allowed to be turned off to remove what was originally an AMD design. There likely won't be many games that really use it to start with, except for a few AMD-backed games, but with it off you would get a true apples-to-apples test instead of apples to oranges, since it involves AMD tech.
 
Please use the edit and multi-quote options. Double posting isn't tolerated, triple even less.

I'm afraid I don't understand. I was replying to three different people, not the same person three times. That's my understanding of a double or triple post. Are you saying I can't respond to more than one person at a time? If so, that's quite the limitation, and will invariably lead to several people ganging up on one person.
 
I'm afraid I don't understand. I was replying to three different people, not the same person three times. That's my understanding of a double or triple post. Are you saying I can't respond to more than one person at a time? If so, that's quite the limitation, and will invariably lead to several people ganging up on one person.

No, simply use the multi-quote button, or you can respond to multiple people in turn using the @user name.
 
No, simply use the multi-quote button, or you can respond to multiple people in turn using the @user name.
Kudos on finding the user who is user. He is apparently from Moscow.
 
I'm afraid I don't understand. I was replying to three different people, not the same person three times. That's my understanding of a double or triple post. Are you saying I can't respond to more than one person at a time? If so, that's quite the limitation, and will invariably lead to several people ganging up on one person.
Those are your 3 posts after a mod fixed them.
If you had used the multi-quote option, or quoted the next person after you'd answered the previous one in the same post, it would have looked like this. Everyone you quoted would receive an alert that they had been quoted, and none of them would have missed that you responded to them personally.
 
Kudos on finding the user who is user. He is apparently from Moscow.

WOW! I didn't even realize that when I did it!! :laugh: Hopefully that person doesn't get a lot of callouts in threads from now on.
 
It will be very interesting to watch this play out, but as is repeatedly being pointed out and people seem to ignore, AoS (from the guys that gave you the first Mantle showpiece) using async only makes it level. Is this AMD's (Fiji's) best scenario? If so, Nvidia will be licking their lips for their turn. And we should all be scared, because it will be a Gameworks apocalypse :fear:
I'm a-scared... but the division between games due to vendor prioritization is only part of it. We are told that DX12 places more power in the hands of the developer, and that the onus is on the developer to get it right - to get the coding working the first time, since the graphics driver has a more limited ability to influence the final product.

Bearing that in mind, and the rise of game studios/publishers rushing out titles that are less than polished, does anyone have any faith that the EAs and Ubisofts of the industry are up to the extra workload?
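
(To make concrete what "more power in the hands of the developer" means, here's a rough, hypothetical C++/D3D12 sketch - not anyone's actual engine code, and the helper and parameter names are invented. It shows the kind of housekeeping the DX11 driver used to do for you: resource state transitions and CPU/GPU fence synchronisation are now the game's job, and the driver has far less room to quietly fix mistakes.)

```cpp
// Illustrative sketch only (hypothetical helper, not anyone's actual engine code).
// Assumes an existing command list, queue, fence, Win32 event and a texture
// that the next pass reads; error handling omitted.
#include <windows.h>
#include <d3d12.h>

void FinishPass(ID3D12GraphicsCommandList* cmdList,
                ID3D12CommandQueue* queue,
                ID3D12Fence* fence, UINT64& fenceValue, HANDLE fenceEvent,
                ID3D12Resource* sceneTexture)
{
    // 1. Resource state transitions are now the developer's job. Miss one and
    //    you get corruption or a hang - the driver no longer patches it up.
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = sceneTexture;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
    cmdList->Close();

    // 2. So is CPU/GPU synchronisation: the app must not reuse memory or
    //    command allocators until the GPU says it is finished with them.
    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists);
    queue->Signal(fence, ++fenceValue);
    if (fence->GetCompletedValue() < fenceValue)
    {
        fence->SetEventOnCompletion(fenceValue, fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE);
    }
}
```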
 
I'm a-scared... but the division between games due to vendor prioritization is only part of it. We are told that DX12 places more power in the hands of the developer, and that the onus is on the developer to get it right - to get the coding working the first time, since the graphics driver has a more limited ability to influence the final product.

Bearing that in mind, and the rise of game studios/publishers rushing out titles that are less than polished, does anyone have any faith that the EAs and Ubisofts of the industry are up to the extra workload?

Not only them, but how about the hundreds of indie devs? Games are going to be more costly to make. And this is yet another reason for slower DX12 adoption rates. I keep saying this, because logic seems lost on way too many people over here.
 
Not only them, but how about the hundreds of indie devs? Games are going to be more costly to make. And this is yet another reason for slower DX12 adoption rates. I keep saying this, because logic seems lost on way too many people over here.

I'm with you, and have been saying it as well. The number of DirectX 12 games is not going to grow at any exponential rate this first year, as some keep predicting. W10 is only hovering around 20%. lilhasselhoffer remarked, correctly I think, that the initial 16% in the first month was the quick rush of early adopters, yet despite being free, the adoption rate has slowed.

Game manufacturers are going to concentrate on the majority for at least the next year, which right now is using DirectX 11. Hell, look at all the huge games released this year, and even post W10: DirectX 11. Upcoming AAA titles also, DirectX 11.

So while AMD appears to perform better in DirectX 12 now, Nvidia will be following suit in the spring with DirectX 12 GPUs, and will presumably be performing at the same level by the time any meaningful number of DirectX 12 games are released.
 
I'm with you, and have been saying it as well. The number of DirectX 12 games is not going to grow at any exponential rate this first year, as some keep predicting. W10 is only hovering around 20%. lilhasselhoffer remarked, correctly I think, that the initial 16% in the first month was the quick rush of early adopters, yet despite being free, the adoption rate has slowed.

Game manufacturers are going to concentrate on the majority for at least the next year, which right now is using DirectX 11. Hell, look at all the huge games released this year, and even post W10: DirectX 11. Upcoming AAA titles also, DirectX 11.

So while AMD appears to perform better in DirectX 12 now, Nvidia will be following suit in the spring with DirectX 12 GPUs, and will presumably be performing at the same level by the time any meaningful number of DirectX 12 games are released.

Don't forget that all new Xbox One games will be DX12, so they may not hit PC 'til next year, but the numbers could climb fast because of them. Part of the idea there was that all they need is an alternate control scheme and they're halfway to a PC port.
 
Don't forget that all new Xbox One games will be DX12, so they may not hit PC 'til next year, but the numbers could climb fast because of them. Part of the idea there was that all they need is an alternate control scheme and they're halfway to a PC port.

I agree with you! As you say though, "next year" - which is why I assert it is not a huge deal right now that AMD performs better in DX12.
 
I can't see either AMD or NVIDIA ~not~ taking an active role in encouraging game developers to utilize their own DX-12 technologies.
Both of them have already helped with proper coding of games in the past because it's in their best interests to do just that.
I would love to see them take similar approaches to solving in-game challenges so that there will be standards that everyone (GPU ~and~ monitor makers) can cleave to.

This might lead to simpler choices for those of us who want to upgrade our LED monitors without having to buy into one company's technology and not be able to use GPUs from the other company.
The G-Sync premium is an unnecessary expense being tacked onto monitors.
 
I can't see either AMD or NVIDIA ~not~ taking an active role in encouraging game developers to utilize their own DX-12 technologies.
Both of them have already helped with proper coding of games in the past because it's in their best interests to do just that.
I would love to see them take similar approaches to solving in-game challenges so that there will be standards that everyone (GPU ~and~ monitor makers) can cleave to.

This might lead to simpler choices for those of us who want to upgrade our LED monitors without having to buy into one company's technology and not be able to use GPUs from the other company.
The G-Sync premium is an unnecessary expense being tacked onto monitors.

No it won't. Either company just wants to make money. If their shit works, it works, proprietary or not. If customers buy that shit, they have already won the game. There is no need to keep investing in universal standards. Look at Apple and mini-USB, and you'll see why this is true.

People need to get those AMD fairy tales out of their heads. This perfect coalition is not going to happen; it is utopia. Companies WANT to differentiate their products - they call those USPs.

For this reason I just avoid 'sync' screens altogether; it is a price premium for a minimal advantage. Vsync still exists, and 90% of all games can handle a little input lag just fine. I put that money towards more GPU and lock stuff at 120 fps/Hz instead. Done deal: no vendor lock-in, more performance at equal cost.
 
I'm with you, and have been saying it as well. The number of DirectX 12 games is not going to grow at any exponential rate this first year, as some keep predicting. W10 is only hovering around 20%.
I agree with you! As you say though, "next year" - which is why I assert it is not a huge deal right now that AMD performs better in DX12.
The thing is, I would expect games will have a low-level DX12 path in them, but they will also have a DX11 fallback for people still on 7/8, which will limit performance gains since draw calls will still be bound by DX11 speeds. That will help AMD, yes, but likely won't give them that much of a lead, if any. It will probably be like BF4 with Mantle, which only gave AMD about 10% over Nvidia on higher-end machines.
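
(A rough illustration of that DX12-path-with-DX11-fallback idea - purely hypothetical C++, not from any actual engine, with made-up names: try to create a D3D12 device, and if the OS/driver can't, drop down to the D3D11 path that Windows 7/8 users will be on, which is exactly why the lowest common denominator caps the DX12 gains.)

```cpp
// Illustrative sketch only - hypothetical backend selection, not from any real engine.
#include <d3d11.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

enum class Backend { D3D12, D3D11, None };

// Prefer the low-level DX12 path (cheap draw calls, async queues); otherwise
// fall back to the DX11 path that players still on Windows 7/8 will use.
Backend CreateBestDevice(ComPtr<ID3D12Device>& dev12, ComPtr<ID3D11Device>& dev11)
{
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                    IID_PPV_ARGS(&dev12))))
        return Backend::D3D12;   // Windows 10 + a capable driver

    ComPtr<ID3D11DeviceContext> ctx;
    D3D_FEATURE_LEVEL level;
    if (SUCCEEDED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                    nullptr, 0, D3D11_SDK_VERSION,
                                    &dev11, &level, &ctx)))
        return Backend::D3D11;   // the fallback the game still has to target

    return Backend::None;
}
```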
 