Thursday, August 22nd 2019

AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

AMD CEO Lisa Su answered some questions from the attending press at the Hot Chips conference. One of these regarded AMD's stance on CrossFire and whether or not it remains a focus for the company. CrossFire was once the poster child for a scalable consumer graphics future, with AMD even going so far as to enable mixed-GPU support (with debatable merits). Now, Lisa Su has come out and said what we have all seen happening in the background: "To be honest, the software is going faster than the hardware. I would say that CrossFire isn't a significant focus."

There isn't anything really new here; we've all seen the consumer GPU trends of late, with CrossFire barely deserving a mention (and the NVIDIA camp doing the same for its SLI technology, which has been cut from all but the higher-tier graphics cards). Support seems to be enabled as an afterthought rather than a "focus", and that's just the way things are. The old practice of buying a lower-tier GPU at launch and then adding a second graphics processor further down the line to leapfrog the performance of higher-end single-GPU solutions is going the way of the proverbial dodo - at least until an MCM (Multi-Chip Module) approach sees the light of day, paired with a hardware syncing solution that does away with the software side of things. A true, integrated, software-blind multi-GPU solution built from two or more dies, each smaller than a single monolithic chip, seems to be the way to go. We'll see.
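To see why that software side is such a burden, consider alternate-frame rendering (AFR), the scheme CrossFire and SLI typically use. The toy model below is an illustrative simplification - not AMD's or NVIDIA's actual scheduling logic, and the 16 ms figures are made up - but it shows the classic problem: two equally fast GPUs finish their frames at almost the same moment, so frames arrive in bursts even though the average frame rate doubles.

```python
def afr_present_times(gpu_frame_ms, n_frames):
    """Idealized AFR: frame i renders on GPU i % N; each GPU renders
    its own frames back to back, and a finished frame is presented
    immediately (no frame-pacing logic)."""
    gpu_free = [0.0] * len(gpu_frame_ms)  # when each GPU finishes its current frame
    last_present = 0.0
    presents = []
    for i in range(n_frames):
        g = i % len(gpu_frame_ms)
        done = gpu_free[g] + gpu_frame_ms[g]
        gpu_free[g] = done
        # Frames must be presented in order, so never earlier than the last one.
        last_present = max(last_present, done)
        presents.append(last_present)
    return presents

# Two identical 16 ms GPUs: frames land in pairs (16, 16, 32, 32, ...),
# so the frame-to-frame intervals alternate 16 ms / 0 ms even though the
# average interval is a "smooth-looking" 8 ms - the micro-stutter effect.
times = afr_present_times([16.0, 16.0], 6)
deltas = [b - a for a, b in zip([0.0] + times, times)]
print(deltas)  # [16.0, 0.0, 16.0, 0.0, 16.0, 0.0]
```

A real driver's frame-pacing code has to delay presents per game to even out those intervals - exactly the kind of ongoing software work Su is saying AMD no longer prioritizes, and the kind a hardware-synced MCM design would make unnecessary.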
Source: TweakTown

88 Comments on AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

#1
kapone32
That really sucks for those of us who use CrossFire. Though many games don't support it, the ones that do shine brightly. I was contemplating getting the 5700 XT, but I will probably get a used Vega 7 instead when the prices come down.
#2
lynx29
I mean they are struggling to get even single gpu drivers stable this round... so... yeah I kind of assumed this. I really hope their extra revenue allows them to hire some more engineers for gpu drivers, cause too many people have been having issues...
#3
kapone32
lynx29 said:
I mean they are struggling to get even single gpu drivers stable this round... so... yeah I kind of assumed this. I really hope their extra revenue allows them to hire some more engineers for gpu drivers, cause too many people have been having issues...
You might be right, as you no longer need a bridge connection to enable CrossFire on AM4 or X399 boards. And just about all of them have CrossFire support written on the box.

BTW, I like your pic. Is that from The Dragon Reborn?
#4
Aldain
Sadly, both SLI and CF are going the way of the dodo...
#5
lynx29
kapone32 said:
You might be right, as you no longer need a bridge connection to enable CrossFire on AM4 or X399 boards. And just about all of them have CrossFire support written on the box.

BTW, I like your pic. Is that from The Dragon Reborn?
It's the artwork from the first Wheel of Time book, The Eye of the World.

Aldain said:
Sadly, both SLI and CF are going the way of the dodo...
Yep, and DirectX 12 promised us easy allocation of multiple graphics sources all at once for higher frame rates... lol, what a lie, M$.
#6
64K
With very few developers even supporting CrossFire/SLI anymore, there is little point in AMD or NVIDIA spending any time or money on it. One site I go to still has a GTX 690 for benches, and with SLI enabled he usually gets the same FPS as running on a single GPU on that card, so what's the point? This is maybe why the GTX 690 and R9 295X2 were the last gaming dual GPUs. I don't count the Titan Z as a gaming card at $3,000, but it was a dual-GPU Kepler as well.

I am interested in the MCM approach though. It wouldn't need any support from developers because, as I understand it, the multiple chips are treated as one GPU.

lynx29 said:
Yep, and DirectX 12 promised us easy allocation of multiple graphics sources all at once for higher frame rates... lol, what a lie, M$.
MS just said that stuff about DX12 being able to pair an AMD card with an NVIDIA card to push Win 10. They knew they couldn't do it.
#7
yakk
CrossFire used to be a requirement for a game to be considered for AMD sponsorship.

But now, with the low lag times some games require and with multi-GPU frame times showing up badly on graphs, I can see why hardware vendors are moving away from it. I guess my next upgrade will be CrossFire-free for the first time in over a decade...
#8
er557
Loyal SLI user here, happy with it. It is supported in all drivers released, and I play a bit older games anyway.
BTW, it is nice to have; now with Pascal RTX support, every performance bit is useful.
#9
Kohl Baas
To be honest, technically I simply don't know what happened with that 3D/VR approach of having two separate graphics cards each working solely on the image for one eye. They promised it during the 3D craze, but nothing happened. They also promised it during the VR craze a decade later, but again, nothing happened...
#10
Markosz
"To be honest, the software is going faster than the hardware, I would say that CrossFire isn't a significant focus"

Isn't that the perfect situation in which to make multiple GPUs work? I mean, if they can't keep up with the software's hardware demands with a single card, then "just throw in more".
Since AMD doesn't have a real high-end card, it would seem like a good idea to focus on CrossFire, but I guess it's not only their fault. Game developers would need to put in a bit of extra work to make scaling efficient, and that's not worth it for them or for their partner GPU vendor (which is usually NVIDIA with their ****Works).

A dual 5700 XT (or many cheap cards) setup would beat a 2080 Ti for much less money in ideal situations, and that would make the most expensive card irrelevant.
#11
dozenfury
There have been a few factors that have led to this, I think. One is the general inflation in video card pricing, where $700-$1,000 for a gaming PC video card is kind of the norm. Years ago, when cards were generally $300-$400, the idea of running two cards in SLI/CrossFire was expensive but more feasible.

The other big factor, I think, was DX12 and the thinking that it would multiply GPU power by leveraging multiple cards, even of different makes and models. In practice it ended up not being all it was cracked up to be in that respect. Even though DX12 didn't actually make SLI/CrossFire obsolete, the expectation that it would surely didn't help with R&D funds being allocated to those areas.

And lastly, GPU power reached the point where you can game comfortably even at high resolutions and maxed settings with 1080 Ti-class cards, without the need for SLI/CrossFire. And with all of the driver issues and bugs that SLI/CrossFire add, it's just not worth the trouble.
#12
er557
Can't play at 4K ultra in most games today without SLI.
#13
Razrback16
kapone32 said:
That really sucks for those of us who use CrossFire. Though many games don't support it, the ones that do shine brightly. I was contemplating getting the 5700 XT, but I will probably get a used Vega 7 instead when the prices come down.
No doubt. I originally got into multi-GPU with a pair of 4870s back in the day and loved it. With all but one upgrade, I've continued to run multi-GPU to this day, and I've had great experiences. No single GPU out there can do a steady 4K/60 fps with every setting maxed in the games I play, but my SLI setup does, and it's butter smooth too. Too bad from the AMD angle: since they can't compete toe to toe with NVIDIA on high-end GPUs, their only chance was to make their multi-GPU work better than NVIDIA's, and they apparently aren't interested, so they've effectively eliminated themselves as a potential hardware upgrade path for me. I can only hope Intel does a better job of giving NVIDIA some competition.
#14
RichF
The fraud that is the "console" is a gift that just keeps giving.
#15
Mephis
lynx29 said:
Yep, and DirectX 12 promised us easy allocation of multiple graphics sources all at once for higher frame rates... lol, what a lie, M$.
How exactly did MS lie? They always said that with DX12 the burden of getting multi-GPU to work shifts from the driver team to the game developer. Unfortunately, game developers have no incentive to make multi-GPU work. They gain nothing from spending the extra resources, and the number of CrossFire/SLI systems was always small.
#16
PetBB
kapone32 said:
That really sucks for those of us who use CrossFire. Though many games don't support it, the ones that do shine brightly. I was contemplating getting the 5700 XT, but I will probably get a used Vega 7 instead when the prices come down.
Radeon VII doesn't support Crossfire.
#17
Metroid
I hope it stays dead. Last time, AMD put two RX 480s together to compete with the 1070 and said it was better, hehe.
#18
RH92
er557 said:
Can't play at 4K ultra in most games today without SLI.
Really? A single 1080 Ti was already achieving 4K/60 fps in most games back in 2017, and the 2080 Ti definitely achieves well above 60 fps in pretty much all games (unless it's some poorly optimised junk)!

Maybe you are not up to date with GPU tech......
#19
Patriot
DX12 has killed off SLI and CrossFire.
And almost no vendor uses its native multi-GPU support.
#20
Aquinus
Resident Wat-man
64K said:
I am interested in the MCM approach though. It wouldn't need any support from developers because, as I understand it, the multiple chips are treated as one GPU.
You took the words right out of my keyboard. MCM is an excellent approach to the scaling problem. This is an area where AMD has a lot more experience under their belt now, and I can totally see them taking advantage of it. Fiji might have been "meh", but it was the first time we really saw AMD try a multi-chip design, and since then we've seen huge advances in how they do it, from moving to HBM2 to communicating over IF literally everywhere else.

Bottom line is that it has the same advantages as AMD's other MCM offerings, so why the hell not? Even if they were to fail, it would be a valiant effort.
#21
lynx29
Mephis said:
How exactly did MS lie? They always said that with DX12 the burden of getting multi-GPU to work shifts from the driver team to the game developer. Unfortunately, game developers have no incentive to make multi-GPU work. They gain nothing from spending the extra resources, and the number of CrossFire/SLI systems was always small.
https://www.pcgamer.com/directx-12-will-be-able-to-use-your-integrated-gpu-to-improve-performance/

Multiadapter was supposed to make it easy for developers to utilize multiple GPUs - any GPUs, even integrated - all at the same time. Apparently MS didn't make it easy enough, though, because no one used it.
#22
danbert2000
Kohl Baas said:
To be honest, technically I simply don't know what happened with that 3D/VR approach of having two separate graphics cards each working solely on the image for one eye. They promised it during the 3D craze, but nothing happened. They also promised it during the VR craze a decade later, but again, nothing happened...
It looks like VR SLI is what you were talking about, and it is still somewhat supported by NVIDIA, but developing your game to target both VR and SLI users means focusing a lot of energy on a fraction of a fraction of the market that has both technologies.

https://developer.nvidia.com/vrworks/graphics/vrsli

Also, after simultaneous multiprojection came out, there was less of a performance gain from using two cards instead of one. When one card can process both geometries without a performance hit, your main reason for having two GPUs is gone. And it's a lot easier to code for, and you don't run into situations where the framerate differs between the eyes, which I could imagine being an extremely uncomfortable sensation. Especially since the VR headsets are currently running at fixed framerates with VSync, so you would essentially have one eye seeing a frame twice while the other sees something different. It would probably feel like crossing your eyes or having a stroke.

Here's the information on SMP

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/11
#24
ZoneDymo
kapone32 said:
That really sucks for those of us who use CrossFire. Though many games don't support it, the ones that do shine brightly. I was contemplating getting the 5700 XT, but I will probably get a used Vega 7 instead when the prices come down.
I used CrossFire in the past, and like SLI, it's just not worth the annoyances it brings.
#25
moproblems99
lynx29 said:
Multiadapter was supposed to make it easy for developers to utilize multiple GPUs - any GPUs, even integrated - all at the same time. Apparently MS didn't make it easy enough, though, because no one used it.
It could be easy as heck, but you seem to be missing the point that developers have absolutely no reason to spend a dime on it. They are already complaining about money and crunch.

If you were in their shoes, dealing with budget and crunch woes, would you create more work for yourself for the whole two people in the world who use CrossFire? Work that will give you absolutely zero returns?