Thursday, August 22nd 2019

AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

AMD CEO Lisa Su answered some questions from the attending press at the Hot Chips conference. One of these regarded AMD's stance on CrossFire and whether or not it remains a focus for the company. CrossFire was once the poster child for a scalable consumer graphics future, with AMD even going as far as enabling mixed-GPU support (with debatable merit). Now Lisa Su has come out and said what we have all been watching happen in the background: "To be honest, the software is going faster than the hardware, I would say that CrossFire isn't a significant focus".

There isn't anything really new here; we've all seen the consumer GPU trends of late, with CrossFire barely deserving a mention (and the NVIDIA camp does the same for its SLI technology, which has been cut from all but the higher-tier graphics cards). Support seems to be enabled as more of an afterthought than a "focus", and that's just the way things are. It seems the old practice of buying a lower-tier GPU at launch and then adding a second graphics processor further down the line to leapfrog the performance of higher-tier, single-GPU solutions is going the way of the proverbial dodo - at least until an MCM (Multi-Chip Module) approach sees the light of day, paired with a hardware syncing solution that does away with the software side of things. A true, integrated, software-blind multi-GPU solution composed of two or more dies, each smaller than a single monolithic chip, seems to be the way to go. We'll see.
Source: TweakTown

88 Comments on AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

#26
EarthDog
Both mGPU techs have gone the way of the dodo. Good riddance. :)

When it worked, scaling was always meh, maybe 50-70% on average (in other words, a second card bought you roughly 1.5-1.7x the performance, not 2x). A few titles did better, more did worse or not at all. Only if it is a genuine need, say 4K with mid-range tier cards, can I see it being worthwhile.

In the words of Elsa....LET IT GO... LET IT GO! CANT HOLD IT BACK ANY MORE!!!!! :p

But but b...I can buy two cheap gpus to make a midrange card.... stooooooooopit..lol
Posted on Reply
#27
Fluffmeister
Back when the GTX 280 was king I played with tri-SLI for a while and it was fun to play around with, but the buzz came mostly from putting one GTX 280 in... then another... THEN ANOTHER.
Posted on Reply
#28
MonteCristo
It reminds me .. single core CPUs, then 2 cores.. now 32.. OK I know it is not the same with GPUs, but it all comes down to the right technology implementation, to start the dual GPU core magic!!!
;)
Posted on Reply
#29
Alpha_Lyrae
kapone32You might be right as you no longer need a bridge connection to enable crossfire on AM4 or X399 boards. And just about all of them have crossfire support written on the box.
You no longer need a bridge connection because AMD gaming GPUs use an XDMA engine in-die to communicate over PCIe. Each GPU can access the other's memory directly. It's been this way since Hawaii (R9 290X), after AMD's frame times in earlier Crossfire setups were shown to be abysmal. Once this is removed from the hardware, Crossfire support can no longer be enabled (Radeon VII and 5700/XT).

Radeon Instinct MI50 and MI60 need a bridge for Infinity Fabric Link to support high-bandwidth transfers (200GB/s+).

I run 2 Vega64s in CF and it's mostly fine. Newer driver features like Enhanced Sync or even Freesync cause stuttering though, so as long as you know about the limitations, it's okay. Using ReLive during Crossfire gameplay also reduces performance as it triggers adaptive GPU clocking, which is normally disabled in Crossfire to maximize XDMA performance.

Drivers after 19.7.1 have Crossfire profiles missing too.
Posted on Reply
#30
Space Lynx
Astronaut
moproblems99It could be easy as heck but you seem to be missing the point that developers have absolutely no reason to spend a dime on it. They are already complaining about money and crunch.

if you were in their shoes dealing with budget and crunch woes, would you create more work for yourself for the whole two people in the world that use crossfire? That will give you absolutely zero returns?
If it was as easy as heck, then the free market would supply an employee that can do it rather cheaply, or it could be subsidized by M$. Similar to how Nvidia sends some of its engineers to game studios to help them out from time to time. So it must not be all that important to M$ besides a pretty headline 4 years ago. /shrug

Any other thoughts captain?
Posted on Reply
#31
Dave65
I couldn't care less about CF or SLI.
Posted on Reply
#32
Keullo-e
S.T.A.R.S.
R9 290 CF user here. Not my first multi-GPU setup, but when it works, the second card gives a nice boost. I'd say that the thermals are the worst problem (only one card is watercooled).
Posted on Reply
#33
Mamya3084
I have 2 Vega FEs in Crossfire. When it works, it's awesome; however, the latest drivers have caused nothing but trouble.
Posted on Reply
#34
TheGuruStud
Dual GPU has been dead since the 290X. Devs don't care. They can't even release a game without a day-one 15 GB patch. It's all a joke, just like gaming in general today. AAA titles are crap, microtransactions (along with pay2win), sold on lies, etc.

Remember when gaming was good? Pepperidge Farm remembers.
Posted on Reply
#35
Fluffmeister
Yeah, sadly the modern consoles are weak. Hence the whole industry is being held back.
Posted on Reply
#36
Shadowdust
I got into Crossfire with my old HD 4850 setup. It was fun to play with, but honestly I spent more doing that than just buying a high-end card. I actually had a 4850 X2 plus one more 4850 in Crossfire, and man, that was a heater. I haven't really done Crossfire since. I've found that buying a sub-$500 GPU tends to be a more efficient and cost-effective way to manage my setup.
Posted on Reply
#37
dinmaster
The way forward is the same as with their CPUs: chiplets. Once they do that with GPUs, it will be a big step forward.
Posted on Reply
#38
moproblems99
lynx29If it was as easy as heck, then the free market would supply an employee that can do it rather cheaply, or it could be subsidized by M$. Similar to how Nvidia sends some of its engineers to game studios to help them out from time to time. So it must not be all that important to M$ besides a pretty headline 4 years ago. /shrug

Any other thoughts captain?
First, what does a free market have to do with anything? Microsoft doesn't care. They make an operating system and a graphics API to go with it, and publish a few games. Again, Microsoft doesn't really stand to gain much either. People already have Windows whether mGPU works or not. They aren't going to make any more money.

If anyone stands to gain from mGPU, it's AMD. If mGPU worked well, AMD could sell two 580s at a pop so someone could get 2080 performance, or two 5700s to get a 2080 Ti. Nvidia probably doesn't care too much about it because they make way more money on 2080+ GPUs than they would on the lower cards. I'd wager they make more money on a single 2080S than they do on two 2060S purchases. Probably why they have been slowly raising the bar for the GPUs that can do SLI: 960, 1070, 2080.

With all that, both of the GPU makers say it isn't worth spending resources developing drivers and helping studios implement mGPU. Why? Because almost no one uses it.
Posted on Reply
#39
Mephis
lynx29www.pcgamer.com/directx-12-will-be-able-to-use-your-integrated-gpu-to-improve-performance/

Multiadapter was supposed to make it easy for developers to utilize multiple GPUs, any GPUs, even integrated ones, all at the same time. Apparently MS didn't make it easy enough though, because no one used it.
From the very article you linked:

"It may be free performance for gamers, but that doesn't mean it's free for developers to implement. As PCPer points out, "Unlinked Explicit Multiadapter is also the bottom of three-tiers of developer hand-holding. You will not see any benefits at all, unless the game developer puts a lot of care in creating a load-balancing algorithm, and even more care in their QA department to make sure it works efficiently across arbitrary configurations."

And:

"Likewise, DirectX12 making it possible for Nvidia and AMD graphics cards to work together doesn't guarantee either company will happily support that functionality. "
lynx29If it was as easy as heck, then the free market would supply an employee that can do it rather cheaply, or it could be subsidized by M$. Similar to how Nvidia sends some of its engineers to game studios to help them out from time to time. So it must not be all that important to M$ besides a pretty headline 4 years ago. /shrug

Any other thoughts captain?
You seem to be the only one saying it was easy. The very article pointed out that it wouldn't be easy and that the game developer and hardware driver teams would still have to dedicate resources to it. Just because you decided something should be easy doesn't mean anyone lied to you, especially when they said from the beginning that it wouldn't be.
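To make concrete what "explicit" means in the Unlinked Explicit Multiadapter passage quoted above: under DirectX 12 the application itself enumerates every adapter and creates a separate, independent device on each one, and from that point on all load balancing, cross-adapter copies, and synchronization are the application's own responsibility. Below is a minimal, illustrative C++ sketch of just that first step (not sample code from Microsoft or any game engine), using the standard DXGI/D3D12 entry points CreateDXGIFactory1, EnumAdapters1, and D3D12CreateDevice:

```cpp
// Sketch only: enumerate every hardware adapter and create an unlinked
// D3D12 device on each. Everything after this (splitting work, copying
// results between adapters, syncing) is up to the application.
// Link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return {};

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // one unlinked device per GPU, even mixed vendors
    }
    return devices;
}
```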
Posted on Reply
#40
Space Lynx
Astronaut
MephisFrom the very article you linked:

"It may be free performance for gamers, but that doesn't mean it's free for developers to implement. As PCPer points out, "Unlinked Explicit Multiadapter is also the bottom of three-tiers of developer hand-holding. You will not see any benefits at all, unless the game developer puts a lot of care in creating a load-balancing algorithm, and even more care in their QA department to make sure it works efficiently across arbitrary configurations."

And:

"Likewise, DirectX12 making it possible for Nvidia and AMD graphics cards to work together doesn't guarantee either company will happily support that functionality. "




You seem to be the only one saying it was easy. The very article pointed out that it wouldn't be easy and that the game developer and hardware driver teams would still have to dedicate resources to it. Just because you decided something should be easy doesn't mean anyone lied to you, especially when they said from the beginning that it wouldn't be.
AMD/Nvidia often send their own engineers to game studios to push brand-specific features. M$ could have done this as well; they just prefer hoarding their money and resources instead, and talking a lot of crap.
Posted on Reply
#41
Mephis
lynx29AMD/Nvidia often send their own engineers to game studios to push brand-specific features. M$ could have done this as well; they just prefer hoarding their money and resources instead, and talking a lot of crap.
Ok, obviously you have created a narrative for yourself and no amount of facts and proof is going to change your mind.
Posted on Reply
#42
Frick
Fishfaced Nincompoop
I would have really liked a world in which hybrid crossfire worked well. Have an APU, add a GPU and that performance is just added on top of the APU performance.
Posted on Reply
#43
renz496
Markosz"To be honest, the software is going faster than the hardware, I would say that CrossFire isn't a significant focus"

Isn't that the perfect situation to make multiple GPUs work? I mean if they can't keep up with software's hardware demand with a single card, then "just throw in more".
Since AMD doesn't have a real high-end card, it would seem like a good idea to focus on CrossFire, but I guess it's not only their fault. Game developers would need to put in a bit of extra work to make scaling efficient, and that's not worth it for them or for their partner GPU vendor (which is usually Nvidia with their ****Works).

A dual 5700 XT (or many cheap cards) setup would beat a 2080 TI for much less money in ideal situations and that would make the most expensive card irrelevant.
It has nothing to do with Nvidia, really. Both SLI and CF use the same technique, called AFR (alternate frame rendering). Modern game engines simply do not like how AFR works.
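For readers unfamiliar with the term: AFR (alternate frame rendering) just hands whole frames to the GPUs in round-robin order, which is exactly what clashes with modern engines, since temporal effects need the previous frame's buffers and under AFR those live on the other GPU. A purely illustrative C++ sketch of the dispatch pattern (the Gpu type here is made up for illustration, not a real driver object):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Stand-in for a real device/queue object; illustration only.
struct Gpu {
    int id;
    void renderFrame(std::uint64_t frame) const {
        // In a real engine, temporal effects (TAA, SSR, motion blur) need
        // buffers from frame N-1, which AFR left on the other GPU --
        // forcing a PCIe copy and a sync point every single frame.
        std::printf("GPU %d renders frame %llu\n", id,
                    static_cast<unsigned long long>(frame));
    }
};

int main() {
    std::vector<Gpu> gpus{{0}, {1}};
    for (std::uint64_t frame = 0; frame < 8; ++frame)
        gpus[frame % gpus.size()].renderFrame(frame); // round-robin: the essence of AFR
}
```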
lynx29www.pcgamer.com/directx-12-will-be-able-to-use-your-integrated-gpu-to-improve-performance/

Multiadapter was supposed to make it easy for developers to utilize multiple GPUs, any GPUs, even integrated ones, all at the same time. Apparently MS didn't make it easy enough though, because no one used it.
It was supposed to make it easier, but that doesn't mean game developers will suddenly embrace it just because it is easier to do. Regardless of how easy it is, it is still regarded as extra work, and not many people have such a setup. Game developers are most often already very busy fixing their games post-launch; adding multi-GPU support would add more problems.
MonteCristoIt reminds me .. single core CPUs, then 2 cores.. now 32.. OK I know it is not the same with GPUs, but it all comes down to the right technology implementation, to start the dual GPU core magic!!!
;)
lol, our GPUs already have thousands of cores inside them, called shaders.
Posted on Reply
#44
er557
RH92Really? A single 1080 Ti was already achieving 4K/60 fps in most games back in 2017, and the 2080 Ti is definitely achieving well above 60 fps in pretty much all games (unless it's some poorly optimised junk)!

Maybe you are not up to date with GPU tech ......
You clearly haven't done your research. Here is a chart with the 2080 Ti at 4K ultra settings; tell me if those frames are all playable. Good luck with that.

[chart: RTX 2080 Ti average frame rates at 4K ultra settings]

This is the article it is taken from:
www.gpucheck.com/gpu/nvidia-geforce-rtx-2080-ti/intel-core-i7-7700k-4-20ghz/
Posted on Reply
#45
Easo
People here railing on MS about DX12: do you really think Nvidia or AMD wants to spend resources to make use of the feature to marry together opposing GPUs? Or game developers?
Be realistic, please.
er557You clearly haven't done your research, here is a chart with 2080 ti @4k ultra settings, tell me if those frames are all playable. good luck with that
Every single one of those games is playable at those framerates. Rephrase the question to "Is this 4K60 gaming?" and then the answer indeed is no.
Posted on Reply
#46
laszlo
A while ago I bought another HD 6950 for CF, and it was the worst decision ever; I had higher fps with one card than with two in CF... in newer games which didn't support it, however...

In my opinion, CF & SLI are a waste of money, especially due to the higher power consumption, if games don't support these features.
Posted on Reply
#47
er557
No; at 60 Hz, 40 fps is not considered good, when SLI gives you double that or at the very least 50% more.
Posted on Reply
#48
dyonoctis
To be honest, I always thought that there's a conflict between pushing graphics further and playing at higher resolutions. We haven't even reached the point where 4K gaming is really mainstream, and we are already hearing stuff about 8K gaming... I'm starting to doubt that silicon will ever make 4K60 a reality in the mainstream segment, unless we stop pushing better graphics for a while.
Posted on Reply
#49
Mephis
er557no, 60hz, and 40 fps is not considered good, when SLI gives you double that or at the very least 50% more.
That isn't what you asked. You asked if they were all "playable". And yes, every game in that chart was playable at those frame rates. Are they perfect or ideal? No, so turn down the quality settings.
Posted on Reply
#50
er557
To each their own, but my scenario is playing games that are a bit older, where SLI is usually supported, and I crank up all the settings including AA; I don't want to turn down quality. So in this scenario, SLI is very feasible, as it allows me to max any title on my monitor, which is 4K. I also don't care for 120 fps; all I need is fast vsync @ 60 Hz.
Posted on Reply