
AMD Phasing Out CrossFire Brand With DX 12 Adoption, Favors mGPU

......but ....but 2+ gpu looks so cool.....how else are we to let the world know we're boss.....sad face.....
 
Not quite. For CrossFire, it was up to AMD to create functional profiles for a game. With mGPU, it's entirely up to game developers to add support for multi-GPU. That also means the engine has to run DirectX 12, because DX11 doesn't even support such a thing. Not sure how it is with Vulkan in this regard...
Well, not exactly.
While CrossFire has become "mGPU" as AMD's marketing term for the thing, nothing has really changed in the technology itself.
The story with DX11 and OpenGL stays the same; Vulkan is a bit up in the air.
For DX12 it matters whether we are talking about implicit or explicit multiadapter.
Implicit is pretty much the same as it is with DX11.
DX12 with explicit multiadapter is what you are talking about. It also comes in two flavours: linked (the GPUs are exposed as a single adapter with multiple nodes, logically similar to SLI/XF) and unlinked (each GPU appears as its own separate adapter).
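For the linked flavour, "logically similar to SLI/XF" usually means alternate-frame rendering: frames are dealt out round-robin across GPU nodes. A toy sketch of that scheduling in plain Python (no real graphics API is involved; the function name and node count are made up for illustration):

```python
# Toy model of alternate-frame rendering (AFR), the scheme SLI/CrossFire
# and DX12's linked explicit mode use logically: each frame is handed to
# the next GPU node round-robin. Purely illustrative, no graphics API.

def afr_schedule(num_frames, num_nodes=2):
    """Return {frame_index: node_index} under round-robin AFR."""
    return {frame: frame % num_nodes for frame in range(num_frames)}

# With two nodes, even frames land on node 0 and odd frames on node 1.
schedule = afr_schedule(6)
```

The catch the posters above describe is exactly this: with implicit/driver-managed modes, the driver picks the schedule; with explicit multiadapter, the game engine has to do it.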
 
Hence why I don't do multi-GPU anymore. It normally worked well enough in the games I play that I was happy, but it's becoming a problem for AMD and Nvidia to keep up with. I'm probably not going to build multi-GPU systems for myself anymore, so this probably won't matter to me.
 
I thought the whole industry was done with the colossal failure that multi-GPU setups were? And now I see AMD simply calls it a different name. WTF?
It's DX12, which is a Microsoft product, that is doing away with CrossFire and SLI being controlled in the driver domain, instead moving multi-GPU support into hard coding in the application. This has little to do with AMD.
 
I like progress, away with crossfire and sli and welcome mGPU.

I doubt many game devs are going to take the time to implement this. It makes zero sense for them financially for 0.0001% of their player base.
 
I doubt many game devs are going to take the time to implement this. It makes zero sense for them financially for 0.0001% of their player base.

Nail, meet head. There's no incentive for them. The only group that stands to gain from mGPU are AMD and Nvidia, yet they do very little subsidizing of game developers. When you think about it, it's quite ironic.
 
Nail, meet head. There's no incentive for them. The only group that stands to gain from mGPU are AMD and Nvidia, yet they do very little subsidizing of game developers. When you think about it, it's quite ironic.

Nice extrapolation of my thoughts; exactly what I was thinking. As you stated, it only made sense for Nvidia or AMD to do it because they can make cash off mGPU sales. Game devs do not, and unfortunately with AMD dropping support it's going the way of the dodo. I don't think it will be a huge deal though, because AMD is planning to release its scalable Navi architecture, which should allow it to make a GPU of any size it needs. I'm sure Nvidia already has one in development as well; they have even said that MCM (multi-chip module) is the future.

GPUs stand to benefit the most from MCM because of their massive die sizes. As you increase die size, cost goes up exponentially. MCM technology in GPUs would greatly reduce the cost of producing high-end GPUs, and it would allow them to scale the product up to almost any size. I say almost because there is a limit on the number of dies that can be put on a single package with current tech.
 
Thing is, developers generally aren't going to support mGPU unless they're paid to do it by AMD or NVIDIA. It's yet another thing to debug that doesn't really improve their game in a way that matters.
 
Nail, meet head. There's no incentive for them. The only group that stands to gain from mGPU are AMD and Nvidia, yet they do very little subsidizing of game developers. When you think about it, it's quite ironic.
Tbh, this could make a difference for professional software. There's always someone out there that needs to squeeze just a bit more juice out of any system. But it's really a solution in search of a problem for home users.
 
Thing is, developers generally aren't going to support mGPU unless they're paid to do it by AMD or NVIDIA.

If Nvidia can do it with GameWorks and PhysX, they can do it with this too.
 
They can't because of the closer-to-the-metal nature of D3D12 and Vulkan. Drivers no longer have the data they need to split frame rendering to multiple cards.
 
If Nvidia can do it with GameWorks and PhysX, they can do it with this too.
They can't because of the closer-to-the-metal nature of D3D12 and Vulkan. Drivers no longer have the data they need to split frame rendering to multiple cards.

I think you missed Vya's point, which, if I have it right, is that since Nvidia was successful subsidizing something as divisive and pervasive as GameWorks, they could do the same for mGPU. This is more about the business of it and less about the technical merits.
 
Gameworks isn't as successful as you think. Case in point: if the game isn't TWIMTBP logo'd, GameWorks is rarely supported. Again, we're only going to see mGPU in games that AMD/NVIDIA pay for it to be included.
 
Gameworks isn't as successful as you think. Case in point: if the game isn't TWIMTBP logo'd, GameWorks is rarely supported. Again, we're only going to see mGPU in games that AMD/NVIDIA pay for it to be included.

lol everyone's a contrarian on the internets.
 
How many new releases actually get a 90%+ framerate improvement running CrossFire/SLI? 1%? Even that? AMD and NVIDIA have both ended 3- and 4-way support. They have also both strongly hinted that the days are running out for 2-way. mGPU, going forward, is in developers' hands, not NVIDIA's and not AMD's.

I can only name two games that use GameWorks off the top of my head: the Arkham series and The Witcher 3. PhysX was open-sourced not too long ago because, outside of UE4 and a few other games, it doesn't get used.
 
Gameworks isn't as successful as you think. Case in point: if the game isn't TWIMTBP logo'd, GameWorks is rarely supported. Again, we're only going to see mGPU in games that AMD/NVIDIA pay for it to be included.

That's likely because GameWorks doesn't have a good track record. I can't think of one time GameWorks made me go "wow" in a positive way. I might go "wow, that really tanks FPS" though. Not only does GameWorks bork AMD cards and prevent AMD from optimizing for the game, but it screws over previous-gen Nvidia cards as well. The GTX 900 series comes out, Nvidia way over-tessellates grass in Crysis 2, and what do you know, the 700 series takes a huge performance hit with tessellation enabled. The 900 series can handle a massive amount of tessellation with little issue; Nvidia exploited that.

GameWorks is more a "we pay the devs so they can put in features that heavily favor our new cards to push sales".
 
GameWorks only exists to convince developers to get into the NVIDIA ecosystem not unlike what Microsoft did with DirectX. There's open source (some by AMD) or engine tools (by like-minded developers) available to do everything GameWorks does without being trapped in an ecosystem.
 
That's likely because GameWorks doesn't have a good track record. I can't think of one time GameWorks made me go "wow" in a positive way. I might go "wow, that really tanks FPS" though. Not only does GameWorks bork AMD cards and prevent AMD from optimizing for the game, but it screws over previous-gen Nvidia cards as well. The GTX 900 series comes out, Nvidia way over-tessellates grass in Crysis 2, and what do you know, the 700 series takes a huge performance hit with tessellation enabled. The 900 series can handle a massive amount of tessellation with little issue; Nvidia exploited that.

GameWorks is more a "we pay the devs so they can put in features that heavily favor our new cards to push sales".
Apart from a few titles, maxing out whatever GameWorks brings to the table also brings the current high-end cards to their knees. And guess what: that's no different from maxing out settings in games without GameWorks.
Sure, GameWorks puts additional stress on the GPU (additional features do that), but inferring that GameWorks was developed solely as a means to push newer cards is a little out there, imho. Especially since the competition at the high end has been pretty much MIA for a few years now.

PS: The Witcher 3 is a GameWorks title that wows. But then again, The Witcher 2 wasn't GameWorks and it still wowed in its time (many couldn't believe it was "only" a DX9 title).
 
What about Hybrid CrossFire? APU+GPU never really worked well, but I like the idea, especially if the APUs become more powerful.
 
What about Hybrid CrossFire? APU+GPU never really worked well, but I like the idea, especially if the APUs become more powerful.
Think about it for a bit. When you upgrade your video card from one generation to the next and get 30-50% more horsepower, best case, a game that ran at 20 fps will run at 30 fps, 40 becomes 60, 60 becomes 90 and 100 becomes 150. Going from 20 to 30 makes an unplayable title possibly playable (if it's a TBS or something), 40 to 60 is much better (though an average of 60 means it will still be below that ~50% of the time), and in the other cases the games remain playable, but you can now turn up the settings.
This is an acceptable gain (though in practice I often find myself skipping a generation). But an IGP will not add nearly as much horsepower to your rig, so you'll gain a lot less. Is it worth the hassle?
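The arithmetic above can be restated in a couple of lines; this just applies the post's best-case 50% uplift to its example frame rates, nothing is measured:

```python
# The post's best-case generational uplift (+50%), applied to its
# example frame rates. Pure arithmetic, no benchmarking involved.

def scaled_fps(base_fps, uplift=0.5):
    """Frame rate after a uniform relative performance uplift."""
    return base_fps * (1 + uplift)

uplifted = {fps: scaled_fps(fps) for fps in (20, 40, 60, 100)}
# 20 -> 30, 40 -> 60, 60 -> 90, 100 -> 150, as in the post.
```

Plugging in a smaller uplift (say 0.1-0.2, closer to what a helper IGP might add) shows why the hybrid route gains so little: 40 fps only becomes 44-48.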
 
If now we have to wait for devs to develop multi GPU engines....then we have to wait. Probably for a looooooooooooooooooooooooooooooooooooooooong time.
 
Think about it for a bit. When you upgrade your video card from one generation to the next and get 30-50% more horsepower, best case, a game that ran at 20 fps will run at 30 fps, 40 becomes 60, 60 becomes 90 and 100 becomes 150. Going from 20 to 30 makes an unplayable title possibly playable (if it's a TBS or something), 40 to 60 is much better (though an average of 60 means it will still be below that ~50% of the time), and in the other cases the games remain playable, but you can now turn up the settings.
This is an acceptable gain (though in practice I often find myself skipping a generation). But an IGP will not add nearly as much horsepower to your rig, so you'll gain a lot less. Is it worth the hassle?

Yeah, if it worked well. And it would be nice to get the same boost by just adding a GPU that costs a lot less than you'd otherwise have to spend to get the same performance.
 
Yeah, if it worked well. And it would be nice to get the same boost by just adding a GPU that costs a lot less than you'd otherwise have to spend to get the same performance.
Well, if you upgrade, you get to sell your old video card, so upgrading is already quite cheap.
 