Thursday, August 22nd 2019

AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

AMD CEO Lisa Su answered some questions from the attending press at the Hot Chips conference. One of these regarded AMD's stance on CrossFire and whether or not it remains a focus for the company. CrossFire was once the poster child for a scalable consumer graphics future, with AMD even going as far as enabling mixed-GPU support (with debatable merit). Now Lisa Su has come out and said what we have all been watching happen in the background: "To be honest, the software is going faster than the hardware, I would say that CrossFire isn't a significant focus".

There isn't anything really new here; we've all seen the consumer GPU trends as of late, with CrossFire barely deserving a mention (and the NVIDIA camp does the same for its SLI technology, which has been cut from all but the higher-tier graphics cards). Support seems to be enabled as more of an afterthought than a "focus", and that's just the way things are. The age-old practice of buying a lower-tier GPU at launch and then adding a second graphics processor further down the line to leapfrog the performance of higher-tier single-GPU solutions is going the way of the proverbial dodo - at least until an MCM (Multi-Chip Module) approach sees the light of day, paired with a hardware syncing solution that does away with the software side of things. A true, integrated, software-blind multi-GPU solution built from two or more dies, each smaller than a single monolithic chip, seems to be the way to go. We'll see.
Source: TweakTown

88 Comments on AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

#76
AvrageGamr
Consoles killed CrossFire and SLI years ago. It's a waste of developer time and resources, since the number of PC users who use them is minimal anyway. And most PC games are console ports.
Posted on Reply
#77
RichF
Vayra86But we also place higher demands on our games now; the dependency on VRAM, I think, was a catalyst for SLI's demise. Nvidia started killing it right at the same time AMD started looking at HBM; Nvidia had to move to delta compression, and VRAM capacities doubled overnight.

Now look at today: the high-end GPU between Maxwell and Pascal gained another 4 GB (970 > 1070), and the top end even goes to eleven ;)

This makes it even harder to sell 'wasted' hardware resources like doubled VRAM.
The solution to that is to have socketed VRAM, just as we have socketed RAM on motherboards.

In fact, it shouldn't be that difficult to have socketed GPUs, too. Perhaps, given the insanely high cost of high-end GPUs these days, it's time to start demanding more instead of passively accepting the disposable-GPU model. If a GPU board has a strong VRM system and is well made, why replace it instead of upgrading the chip?

Personally, I'd like to see GPUs move to the motherboard and the ATX standard be replaced with a modern one that's efficient to cool. A vapor chamber that can cool both the CPU and the GPU could be nice (and the chipset — 40mm fans in 2019, really?). A unified VRM design would be a lot more efficient.

It's amusing that people balk at buying a $1000 motherboard but seem not to notice the strangeness of disposing of a $1200 GPU rather than being able to upgrade it.
Posted on Reply
#78
B-Real
kapone32That really sucks for those of us who use CrossFire. Though many games don't support it, the ones that do shine brightly. I was contemplating getting the 5700 XT, but I will probably get a used Vega 7 instead when the prices come down.
Well, you get ~52% more performance for 100% more money. You're literally throwing half of what you spent on the second GPU out of the window. If they spent even 10% of the effort put into CrossFire (or SLI, as NV is also leaving the SLI train) on anything else, everyone would be happier.
Posted on Reply
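The value arithmetic behind B-Real's point can be made concrete. Below is a minimal sketch; the card price is a purely hypothetical placeholder (only the ratios matter), and the ~52% uplift is taken from the comment above:

```cpp
#include <iostream>

int main() {
    // Hypothetical placeholder price; only the ratios matter here.
    const double cardPrice = 400.0;  // one card, in dollars (assumed)
    const double basePerf  = 100.0;  // single-card performance, normalized
    const double scaling   = 0.52;   // ~52% uplift from the second card

    // Performance per dollar for one card vs. two of the same card.
    const double singleValue = basePerf / cardPrice;
    const double dualValue   = basePerf * (1.0 + scaling) / (2.0 * cardPrice);

    std::cout << "single card: " << singleValue << " perf/$\n";
    std::cout << "two cards:   " << dualValue   << " perf/$\n";
    std::cout << "value kept:  " << 100.0 * dualValue / singleValue << "%\n";
    // Prints "value kept: 76%" -- at +52% performance for +100% cost,
    // roughly half of the second card's price buys nothing, which is
    // the "threw half the money out of the window" point.
}
```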
#79
ssdpro
Good that Lisa could say it plainly... stuff's for chumps, just like the 14.4k modem.
Posted on Reply
#80
er557
They can stop investing in it all they want; I'll do whatever fits performance best.
Posted on Reply
#81
Unregistered
er557They can stop investing in it all they want; I'll do whatever fits performance best.
Ya, I sure hope more developers follow the example set in the Tomb Raider games. I'm in the process of doing my first run-through of those games, and they did such a nice job with the DX12 mGPU implementation - currently playing "Rise". Having mGPU support built into DX12 & Vulkan, so the driver profiles aren't needed anymore, is really cool.

And I'm with ya on multi-GPU in general - you know, it's never been a technology for folks on a budget or looking for perfect scaling. We know we're not getting 100% scaling; it just comes down to wanting a certain level of image quality or framerate, etc., and knowing we can't get it with a single GPU.

I know not everyone is a fan and some folks have reported bad experiences, but IMO, as someone who has run the technology for years and years, I've had a great experience with it. Sure hope more developers make use of mGPU.
Posted on Reply
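For anyone curious what "built into DX12 & Vulkan" means in practice, here is a minimal, hedged sketch of the Vulkan 1.1 side of it (device groups): the application itself enumerates linked GPUs and creates one logical device spanning them, instead of relying on a CrossFire/SLI driver profile. Error handling, queue-family selection, and the actual work distribution are omitted; this illustrates the API shape, not how the Tomb Raider games implement it (they use the D3D12 path).

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Vulkan 1.1 made device groups core, so no extension is needed.
    VkApplicationInfo app{};
    app.sType      = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo instInfo{};
    instInfo.sType            = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    instInfo.pApplicationInfo = &app;
    VkInstance instance;
    vkCreateInstance(&instInfo, nullptr, &instance);

    // Enumerate linked-adapter groups (a CrossFire/SLI pair shows up as
    // one group with physicalDeviceCount > 1).
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    VkPhysicalDeviceGroupProperties blank{};
    blank.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount, blank);
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());
    std::printf("found %u device group(s)\n", groupCount);
    if (groupCount == 0) return 0;

    // Chain VkDeviceGroupDeviceCreateInfo so the logical device spans
    // every GPU in the first group.
    VkDeviceGroupDeviceCreateInfo groupInfo{};
    groupInfo.sType               = VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO;
    groupInfo.physicalDeviceCount = groups[0].physicalDeviceCount;
    groupInfo.pPhysicalDevices    = groups[0].physicalDevices;

    float priority = 1.0f;
    VkDeviceQueueCreateInfo queueInfo{};
    queueInfo.sType            = VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO;
    queueInfo.queueFamilyIndex = 0; // assumed to support graphics here
    queueInfo.queueCount       = 1;
    queueInfo.pQueuePriorities = &priority;

    VkDeviceCreateInfo devInfo{};
    devInfo.sType                = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
    devInfo.pNext                = &groupInfo;
    devInfo.queueCreateInfoCount = 1;
    devInfo.pQueueCreateInfos    = &queueInfo;

    VkDevice device;
    vkCreateDevice(groups[0].physicalDevices[0], &devInfo, nullptr, &device);
    // From here the engine targets individual GPUs with device masks --
    // the work split lives in the game, not in a driver profile.
    vkDestroyDevice(device, nullptr);
    vkDestroyInstance(instance, nullptr);
}
```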
#82
RH92
er557You clearly haven't done your research. Here is a chart with a 2080 Ti @ 4K ultra settings; tell me if those frames are all playable. Good luck with that.
This is the article it is taken from:
www.gpucheck.com/gpu/nvidia-geforce-rtx-2080-ti/intel-core-i7-7700k-4-20ghz/
That's why you don't read only one review, especially not when it comes from some low-reputation website!

Here you have a 35-game sample:

www.techspot.com/review/1701-geforce-rtx-2080/page2.html
www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/

from TechSpot (known as Hardware Unboxed on YouTube, which is a reference when it comes to GPU reviews). Across those 35 games at 4K, the 2080 Ti averages 92 fps, with 73.31 fps in 1% lows; furthermore, there are only 4 games out of 35 where the 2080 Ti doesn't hit 60 fps (and it still stays above 50 fps in those). When you compare that with the review you provided, there are plenty of weird results, such as GTA 5: GpuCheck 59 fps vs TechSpot 122 fps!

Keep in mind this is a stock 2080 Ti; with a memory OC (more important at 4K) you are all but guaranteed to hit over 60 fps in all those games... so yeah, come again and tell me about those unplayable games. Sure, there might be some poorly optimized games here and there, but other than that you have to be out of your mind to believe that a 2080 Ti can't drive 4K/60!
Posted on Reply
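Since the "1% low" figure does a lot of work in arguments like this, here is a minimal sketch of one common way reviewers derive it from a frame-time capture (conventions differ; some report the 99th-percentile frame time instead). The frame times below are made-up example data:

```cpp
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

int main() {
    // Made-up frame-time capture in milliseconds (what a tool like OCAT
    // would log, one entry per rendered frame).
    std::vector<double> frameMs = {10.2, 10.8, 11.0, 10.5, 24.0, 10.9,
                                   10.4, 10.7, 22.5, 10.6, 10.3, 10.8};

    // Average fps = frames rendered / total time.
    double totalMs = 0.0;
    for (double ms : frameMs) totalMs += ms;
    const double avgFps = 1000.0 * frameMs.size() / totalMs;

    // 1% low: average frame rate over the slowest 1% of frames
    // (at least one frame, so small captures still produce a number).
    std::sort(frameMs.begin(), frameMs.end(), std::greater<double>());
    const size_t worstCount = std::max<size_t>(1, frameMs.size() / 100);
    double worstMs = 0.0;
    for (size_t i = 0; i < worstCount; ++i) worstMs += frameMs[i];
    const double onePercentLow = 1000.0 * worstCount / worstMs;

    std::printf("average: %.1f fps, 1%% low: %.1f fps\n", avgFps, onePercentLow);
    // A big gap between the two numbers is what players feel as stutter.
}
```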
#83
kapone32
B-RealWell, you get ~52% more performance for 100% more money. You're literally throwing half of what you spent on the second GPU out of the window. If they spent even 10% of the effort put into CrossFire (or SLI, as NV is also leaving the SLI train) on anything else, everyone would be happier.
That is not the way you do CrossFire. You buy the best GPU around launch (Vega 64), then once used cards show up you get a second one (which cost me less than half). BTW, most of the games that I play fully support CrossFire with more than 52% scaling, especially TWWH; with that I go from 35 to 72 FPS @ 4K Extreme. Other games I play, like Strange Brigade, also support CrossFire through DX12, and the best thing about CrossFire is that if the game does not support it there is no power draw from the 2nd GPU. Indeed, I have been using SLI/CrossFire since the GTS 450 days for... you guessed it, Total War. I know that TW3K does not support CrossFire, but that game is meh compared to TWWH; in fact, to me it is not even as good as TW Shogun 2. There are still plenty of games that support CrossFire, including the Witcher, Watch Dogs, XCOM, Tomb Raider, and Project Cars series, to name a few.
Posted on Reply
#84
renz496
kapone32That is not the way you do CrossFire. You buy the best GPU around launch (Vega 64), then once used cards show up you get a second one (which cost me less than half). BTW, most of the games that I play fully support CrossFire with more than 52% scaling, especially TWWH; with that I go from 35 to 72 FPS @ 4K Extreme. Other games I play, like Strange Brigade, also support CrossFire through DX12, and the best thing about CrossFire is that if the game does not support it there is no power draw from the 2nd GPU. Indeed, I have been using SLI/CrossFire since the GTS 450 days for... you guessed it, Total War. I know that TW3K does not support CrossFire, but that game is meh compared to TWWH; in fact, to me it is not even as good as TW Shogun 2. There are still plenty of games that support CrossFire, including the Witcher, Watch Dogs, XCOM, Tomb Raider, and Project Cars series, to name a few.
Nah. One of the primary reasons you want multi-GPU now is that the fastest single GPU can't provide enough performance; that is the common way we think about multi-GPU right now. But in the past, the majority of people (especially those who couldn't really afford high-end hardware) wanted multi-GPU because it allowed them to get performance even faster than what the fastest single GPU could provide, at a cost cheaper than buying that fastest single GPU. The caveat was that they had to deal with the drawbacks of multi-GPU, but the performance uplift in the majority of games and the cost to make it happen were supposed to outweigh those drawbacks. This is one of the prime examples of it:

www.techpowerup.com/review/nvidia-geforce-gtx-460-sli/25.html
Posted on Reply
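To put rough numbers on renz496's scenario, here is an illustrative sketch using ballpark launch-era prices (GTX 460 1GB around $229, GTX 480 around $499) and an assumed ~80% SLI scaling factor; none of these figures are taken from the linked review, and the flagship uplift is a placeholder:

```cpp
#include <cstdio>

int main() {
    // Ballpark launch prices and normalized performance; all assumed,
    // for illustration only (single GTX 460 = 1.0 perf units).
    const double priceMid      = 229.0; // GTX 460 1GB, approx. launch MSRP
    const double priceFlagship = 499.0; // GTX 480, approx. launch MSRP
    const double perfFlagship  = 1.35;  // assumed flagship uplift over one 460
    const double sliScaling    = 0.80;  // assumed uplift from the second 460

    const double perfDual  = 1.0 + sliScaling; // two 460s in SLI
    const double priceDual = 2.0 * priceMid;

    std::printf("two midrange: %.2fx perf, $%.0f, $%.0f per perf unit\n",
                perfDual, priceDual, priceDual / perfDual);
    std::printf("flagship:     %.2fx perf, $%.0f, $%.0f per perf unit\n",
                perfFlagship, priceFlagship, priceFlagship / perfFlagship);
    // Under these assumptions the pair is both faster (1.80x vs 1.35x)
    // and cheaper per unit of performance -- the old appeal of SLI/CF,
    // which only held while scaling was high and midrange cards were cheap.
}
```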
#85
kapone32
renz496Nah. One of the primary reasons you want multi-GPU now is that the fastest single GPU can't provide enough performance; that is the common way we think about multi-GPU right now. But in the past, the majority of people (especially those who couldn't really afford high-end hardware) wanted multi-GPU because it allowed them to get performance even faster than what the fastest single GPU could provide, at a cost cheaper than buying that fastest single GPU. The caveat was that they had to deal with the drawbacks of multi-GPU, but the performance uplift in the majority of games and the cost to make it happen were supposed to outweigh those drawbacks. This is one of the prime examples of it:

www.techpowerup.com/review/nvidia-geforce-gtx-460-sli/25.html
You are absolutely right.
Posted on Reply
#86
RichF
RH92That's why you don't read only one review, especially not when it comes from some low-reputation website!

Here you have a 35-game sample:

www.techspot.com/review/1701-geforce-rtx-2080/page2.html
www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/

from TechSpot (known as Hardware Unboxed on YouTube, which is a reference when it comes to GPU reviews). Across those 35 games at 4K, the 2080 Ti averages 92 fps, with 73.31 fps in 1% lows; furthermore, there are only 4 games out of 35 where the 2080 Ti doesn't hit 60 fps (and it still stays above 50 fps in those). When you compare that with the review you provided, there are plenty of weird results, such as GTA 5: GpuCheck 59 fps vs TechSpot 122 fps!

Keep in mind this is a stock 2080 Ti; with a memory OC (more important at 4K) you are all but guaranteed to hit over 60 fps in all those games... so yeah, come again and tell me about those unplayable games. Sure, there might be some poorly optimized games here and there, but other than that you have to be out of your mind to believe that a 2080 Ti can't drive 4K/60!
So, we've been getting propaganda to tell us 8K is really important, or, at the very least, something better than 4K is an important upgrade. But, in order to run good ole 4K well we need to spend... how much on exactly one GPU?

So, a reeeaallly expensive GPU that has zero competition in the market... Sounds like a recipe for a bargain, not the situation one is in when there is a monopoly that raises prices artificially.

(I've told people before that it's in Nvidia's interest to get rid of multi-GPU as long as AMD isn't competing at the high end. Since AMD is competing against the PC gaming platform by peddling console hardware it also doesn't have as much incentive to compete at the high end. Letting Nvidia increase prices helps it to peddle its midrange hardware at a price premium, undercutting Nvidia's higher premium. But, why not cheerlead for the situation where we can have any color we want as long as it's black?)

Sell a kidney to play at 4K or stick with 1440. Who needs dual GPU these days? Most everyone's got two kidneys.

(Given AMD's success with Zen 2 chiplets I would expect that future is going to be multi-GPU, only the extra GPU chips will be chiplets.)
Posted on Reply
#87
renz496
RichFSo, we've been getting propaganda to tell us 8K is really important, or, at the very least, something better than 4K is an important upgrade. But, in order to run good ole 4K well we need to spend... how much on exactly one GPU?

So, a reeeaallly expensive GPU that has zero competition in the market... Sounds like a recipe for a bargain, not the situation one is in when there is a monopoly that raises prices artificially.

(I've told people before that it's in Nvidia's interest to get rid of multi-GPU as long as AMD isn't competing at the high end. Since AMD is competing against the PC gaming platform by peddling console hardware it also doesn't have as much incentive to compete at the high end. Letting Nvidia increase prices helps it to peddle its midrange hardware at a price premium, undercutting Nvidia's higher premium. But, why not cheerlead for the situation where we can have any color we want as long as it's black?)

Sell a kidney to play at 4K or stick with 1440. Who needs dual GPU these days? Most everyone's got two kidneys.

(Given AMD's success with Zen 2 chiplets I would expect that future is going to be multi-GPU, only the extra GPU chips will be chiplets.)
But even if they are successful on that front, I don't think it will solve the issue many people dislike about high-end GPUs right now: the price.
Posted on Reply
#88
RH92
RichFSo, we've been getting propaganda to tell us 8K is really important, or, at the very least, something better than 4K is an important upgrade. But, in order to run good ole 4K well we need to spend... how much on exactly one GPU? So, a reeeaallly expensive GPU that has zero competition in the market... Sounds like a recipe for a bargain, not the situation one is in when there is a monopoly that raises prices artificially.
Totally unrelated to the topic, which is whether single GPUs nowadays can drive 4K/60 Ultra...
RichFI've told people before that it's in Nvidia's interest to get rid of multi-GPU as long as AMD isn't competing at the high end. Since AMD is competing against the PC gaming platform by peddling console hardware it also doesn't have as much incentive to compete at the high end. Letting Nvidia increase prices helps it to peddle its midrange hardware at a price premium, undercutting Nvidia's higher premium. But, why not cheerlead for the situation where we can have any color we want as long as it's black?
Interesting theory of yours. Now I would like to hear your theory on why AMD shares the same interest as Nvidia in getting rid of multi-GPU (www.techpowerup.com/258522/amd-ceo-lisa-su-crossfire-isnt-a-significant-focus), considering they are far from having Nvidia's position in the market, and considering that, according to your theory, it would be in their interest to maintain multi-GPU support in order to stand a chance of challenging Nvidia's high end with their midrange GPUs... In other words, your theory doesn't hold water!

It's OK, I get it: bashing Nvidia for no reason will never get old for some people!
RichFSell a kidney to play at 4K or stick with 1440. Who needs dual GPU these days? Most everyone's got two kidneys.
I've been playing at 4K since 2014. I started with an R9 290, then moved to R9 290 CF, then to a single GTX 970 (because of heat and noise, and because one of my 290s died on me), then to a GTX 1060, and recently to a GTX 1080 Ti, which can handle anything I throw at it at 4K/60. All the GPUs I named prior to the 1080 Ti were perfectly able to play at 4K. Obviously you sometimes had to lower some settings depending on the game and the GPU, but with very minimal impact on visual quality (in most games there is barely any difference between Ultra and High). Nowadays GPUs like the GTX 1080 can be found dirt cheap and are perfectly able to handle 4K, assuming you are not one of those "ultra everything" elitists and are smart enough to optimize your game settings to make the most of your hardware!

So to answer your question: no, people don't need to sell a kidney to play at 4K, they just need to buy a brain! I don't know who needs dual GPUs these days, but what I do know is that if you value silence, thermals, and power consumption, there is absolutely no need to go dual-GPU for 4K gaming, especially nowadays.
RichF(Given AMD's success with Zen 2 chiplets I would expect that future is going to be multi-GPU, only the extra GPU chips will be chiplets.)
That's a reasonable expectation, and a much more elegant solution (in terms of architecture/noise/thermals) than dual discrete GPU systems!
Posted on Reply