
AMD Giving Up on CrossFire with RX Vega

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
AMD is reportedly scaling down its efforts to support and promote multi-GPU technologies such as CrossFire, and not in favor of open standards such as DirectX 12 native multi-GPU, either. Speaking to GamersNexus, an AMD representative confirmed that while the new Radeon RX Vega family of graphics cards supports CrossFire, the company may not allocate as many resources to promoting and supporting it as it did with older GPU launches.

This keeps with the industry's drift away from multi-GPU configurations, and aligns with NVIDIA's decision to dial down investment in SLI. It also more or less confirms that AMD won't build a Radeon RX series consumer graphics product based on two "Vega 10" ASICs. At best, one can expect dual-GPU cards for the professional or GPU-compute markets, under the Radeon Pro or Radeon Instinct brands.



 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
It was a driver solution (like SLI). AMD wants a hardware solution (through Infinity Fabric).
 
Joined
Feb 11, 2009
Messages
5,397 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Well, wasn't it sorta the plan that DX12 etc. can simply use multiple GPUs inherently?
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,269 (2.52/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
Well, wasn't it sorta the plan that DX12 etc. can simply use multiple GPUs inherently?
From both brands and nearly any GPU. Seems like NVIDIA and AMD are letting DX12 handle that and have decided to get lazy.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Well, wasn't it sorta the plan that DX12 etc. can simply use multiple GPUs inherently?

Even DX12 has a bunch of pairing modes, and I think they don't want those either. Infinity Fabric is what they are pursuing on GPUs as well. It works really well with CPUs, so naturally they want the same on GPUs. Quite frankly, I can't agree more with that. It has to be a hardware solution, because software ones have proven time and time again that they are rubbish.
 
Joined
Mar 7, 2011
Messages
3,924 (0.82/day)
From both brands and nearly any GPU. Seems like NVIDIA and AMD are letting DX12 handle that and have decided to get lazy.
Basically putting trust in the hands of game developers, who tend to be lazier than the driver teams at either of those two GPU makers. We have seen far too many badly optimised PC ports of console titles anyway, thanks to lazy developers.
 
Joined
Jul 13, 2016
Messages
2,828 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Even DX12 has a bunch of pairing modes, and I think they don't want those either. Infinity Fabric is what they are pursuing on GPUs as well. It works really well with CPUs, so naturally they want the same on GPUs. Quite frankly, I can't agree more with that. It has to be a hardware solution, because software ones have proven time and time again that they are rubbish.

Multi-GPU has always been a problem because of the latency and because you have a CPU as a middleman.

If AMD is able to bring its Infinity Fabric to GPUs, it will completely change the market. The cost of making high-end GPUs would go way down, and it would be incredibly easy to make SKUs to fit every segment of the market. The reason is simple, and you need only look at the CPU market right now for a clue: if AMD can make a profit on the 1700 at $320, they most certainly are making a profit on the 1950X at $1,000, which is essentially two Ryzen 1700 CPUs together. Unlike Intel, AMD doesn't have one massive die with poor yields; they have two smaller dies with near-perfect yields. There are so many advantages to a modular approach, and I think AMD may only be scratching the surface of what we will see in the future.
 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
They can barely fix engine/game problems with drivers (which is NOT their job), so there's no point extending that job to SLI/xfire too. The devs are only getting worse and worse.
 
Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
I think this is a good choice. My experiences with Crossfire have been frustrating at best. I'd rather they stop supporting the feature and reallocate their driver team elsewhere than continue advertising a feature that doesn't live up to its promise because they never invest enough development resources in it.

I now wonder how they are going to promote the Threadripper platform if Crossfire is deprecated. Yeah, 60 PCIe lanes is great, but how many do you need when you're limited to one GPU?
 
Joined
Oct 15, 2010
Messages
208 (0.04/day)
Ahhh crap, does this mean I won't be seeing a dual Vega graphics card? That was kinda what I was looking for!
 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
I think this is a good choice. My experiences with Crossfire have been frustrating at best. I'd rather they stop supporting the feature and reallocate their driver team elsewhere than continue advertising a feature that doesn't live up to its promise because they never invest enough development resources in it.

I now wonder how they are going to promote the Threadripper platform if Crossfire is deprecated. Yeah, 60 PCIe lanes is great, but how many do you need when you're limited to one GPU?

Direct the blame at the devs. They're the ones not implementing it or riding the Nvidia gravy train. Look at games where it is properly supported: scaling is excellent.

You think AMD/Nvidia can't do dual GPUs just fine? They were doing it for many years. I never had issues with 8800 GT SLI or 4890 xfire (hell, 6950s were great, too). It really started going to shit when I had 7950s. By the time the 290X rolled around, I'd had enough of worthless devs, so single card I went.
 

bug

Joined
May 22, 2015
Messages
13,213 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Well, after pushing the technology for over 10 years and not reaching any significant market penetration, what would you expect? That didn't stop people from crapping all over Nvidia when they scaled SLI back on Pascal cards, though.
 
Joined
May 13, 2008
Messages
664 (0.11/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
It was a driver solution (like SLI). AMD wants a hardware solution (through Infinity Fabric).

That's kind of been my sentiment on the sitch as well.

I don't know if a single HBM2 stack makes sense for a ~Polaris-level chip on 7 nm (as opposed to simply 128-bit GDDR6), scaling it up that way rather than shrinking Vega (and then doubling it), but I most certainly expect something like that to happen. I like the idea of 4x scaling because one chip (in essence a ~75 W card) could more or less simply be bolted onto an APU (for example, a 65 W Zen 2). On the other hand, that becomes a questionable use of silicon at some point, given each chip has to carry blocks (like UVD etc.) that would seem redundant (unless those also scaled, which would be kind of weird), and hence less efficient than a somewhat larger chip.

I don't have the time to dig up the Polaris launch event where Raja *pretty much confirmed* this was happening, but he did speak of MCM there as something that was coming, though (at the time, now almost 1.5 years ago) they weren't quite there yet. On top of that, Hynix's HBM slides have shown two AMD-looking dummy chips with their own stacks of HBM on one package since before Fiji's release. Given that 'scalability' is pretty much the only tease we've had of Navi, it's not difficult to put 1+1 together... as it were. Clearly this has been in the works for a looong time.

IIRC Charlie even spoke of it eons ago... perhaps when he worked at The Inq or had just started S|A? I seem to recall an article from a long, long time ago by one of those OG insider cats that pretty much described this being the plan several years down the road, distinctly on the roadmap (I believe it was three generations out at that point... perhaps around the time of Kepler/Southern Islands and Eric Demers' departure, i.e. when AMD stopped talking about interesting aspects of their engineering for a good long while), but I could be wrong.

Either way...Sounds dope AF. :)

For those worried about it, even nVIDIA discussed the prospect openly as recently as a month ago:

http://research.nvidia.com/publication/2017-06_MCM-GPU:-Multi-Chip-Module-GPUs

(Note the last sentence of the summary in which nVIDIA confirms the 'old' way kind of sucks.)
 
Joined
Jun 1, 2007
Messages
150 (0.02/day)
Location
new jersey usa
I think most should be happy about AMD officially saying they won't support it much; for the last 5 or 6 years it has been crap.

I blame all the "oh, DX12 is going to be so great, take my W10" hype. Still nothing has made games look better, just run better on older hardware. So with all that trust placed in BS from MS and game devs, we have lost more in gaming than we gained.
 
Joined
May 13, 2008
Messages
664 (0.11/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I think most should be happy about AMD officially saying they won't support it much; for the last 5 or 6 years it has been crap.

I blame all the "oh, DX12 is going to be so great, take my W10" hype. Still nothing has made games look better, just run better on older hardware. So with all that trust placed in BS from MS and game devs, we have lost more in gaming than we gained.


I think a large part of it is the divide between AMD using general-purpose units (which can often exploit DX12) and nvidia sticking to more efficient 'core' shader ratios plus fixed-function SFUs (similar to the 5+1 or 4+1 setups in old AMD architectures, or the dual-issue MADD + MUL in early DX10 parts), which DX11 largely satiates, in some cases more efficiently when games are built toward them. That's just my opinion (largely backed up by Polaris' performance vs. GP106), but I imagine those would be both of their arguments anyway. AMD would blame nvidia for being slow to innovate and catch up to new techniques (which in the meantime penalizes AMD's designs for building toward them), which isn't new or wrong... while nvidia would claim efficiency (given an SFU is tiny, and one of AMD's CUs can only do 16 SFU ops IIRC), which also isn't wrong.

I often wonder if that will be part of nVIDIA's transition to a smaller process or an 'innovative new arch' at some point: instead of undershooting the ideal shader/ROP ratio and making up for it with clock speed, biting the bullet and going over the ideal compute/ROP ratio (somewhere above 224 but below 256 per 4 ROPs), perhaps even sacrificing SFUs for more potential compute. While it may become necessary at some point, I bet they would hate both to vindicate what has essentially been AMD's design for the last several years and to lose a competitive advantage they *know* devs will program toward because of nvidia's market share.

I'm sure there are other reasons, but that's one that's always been stuck in my craw for whatever reason.
 
Joined
May 29, 2012
Messages
514 (0.12/day)
System Name CUBE_NXT
Processor i9 12900K @ 5.0Ghz all P-cores with E-cores enabled
Motherboard Gigabyte Z690 Aorus Master
Cooling EK AIO Elite Cooler w/ 3 Phanteks T30 fans
Memory 64GB DDR5 @ 5600Mhz
Video Card(s) EVGA 3090Ti Ultra Hybrid Gaming w/ 3 Phanteks T30 fans
Storage 1 x SK Hynix P41 Platinum 1TB, 1 x 2TB, 1 x WD_BLACK SN850 2TB, 1 x WD_RED SN700 4TB
Display(s) Alienware AW3418DW
Case Lian-Li O11 Dynamic Evo w/ 3 Phanteks T30 fans
Power Supply Seasonic PRIME 1000W Titanium
Software Windows 11 Pro 64-bit
That really seals the deal for me on whether to ever consider AMD GPUs. nVidia has scaled back SLI development, but they're still providing a really good amount of support for two-card SLI setups (which, really, was the only configuration that could ever remotely be considered worth it). Not a driver update goes by where I don't see updated or new SLI profiles.

It's really sad to see AMD taking this position given all the high-resolution, high-refresh-rate monitors we're getting these days; even a flagship GPU isn't enough to run a modern AAA game at 4K or high refresh rates without losing frames and dropping below optimal FPS.

They were already only focusing on dual-card CrossFire setups, so scaling back even further from that level of support is disconcerting.
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
So what's gonna populate that excessive number of PCIe slots, then? And doesn't this make the extra PCIe lanes obsolete? Back to the AGP days?
 
Joined
May 18, 2009
Messages
2,744 (0.50/day)
Location
MN
System Name Personal / HTPC
Processor Ryzen 5900x / i5-4460
Motherboard Asrock x570 Phantom Gaming 4 /ASRock Z87 Extreme4
Cooling Corsair H100i / stock HSF
Memory 32GB DDR4 3200 / 8GB DDR3 1600
Video Card(s) EVGA XC3 Ultra RTX 3080Ti / EVGA RTX 3060 XC
Storage 500GB Pro 970, 250 GB SSD, 1TB & 500GB Western Digital / 2x 4TB & 1x 8TB WD Red, 2TB SSD & 4TB SSD
Display(s) Dell - S3220DGF 32" LED Curved QHD FreeSync Monitor / 50" LCD TV
Case CoolerMaster HAF XB Evo / CM HAF XB Evo
Audio Device(s) Logitech G35 headset
Power Supply 850W SeaSonic X Series / 750W SeaSonic X Series
Mouse Logitech G502
Keyboard Black Microsoft Natural Elite Keyboard
Software Windows 10 Pro 64 / Windows 10 Pro 64
I can't comment on Crossfire, but I never really had any issues with SLI going all the way back to the 7xxx series:
7600 GT
8800 GTS 640
8800 GTS 512 (Step-up Program from EVGA, traded in the 640MB models)
GTX 280
GTX 570
GTX 980Ti (though I've removed one card, putting it in another build. one handles all my gaming just fine, even at 5760x1080)

I can see that with current hardware and GPU advances, a high-end current-gen card is very powerful. Most people don't game at 4K with maxed/ultra settings and expect to pull 60 FPS; a high-end GPU at 2K is more than enough, at least it would be for me. It makes sense to scale back SLI/Crossfire support, considering it can be handled at the development level rather than the driver level in DX12. But now that AMD and Nvidia have made it common knowledge that SLI/Crossfire aren't supported as much, devs won't feel they need to support multi-GPU and won't put forth the time/money/effort to code for it in their games.
 
Joined
Aug 6, 2009
Messages
1,162 (0.22/day)
Location
Chicago, Illinois
They can barely fix engine/game problems with drivers (which is NOT their job), so there's no point extending that job to SLI/xfire too. The devs are only getting worse and worse.
Yep, in my opinion SLI and Crossfire have been way too buggy to run since the beginning.
 
Joined
Aug 12, 2006
Messages
3,278 (0.51/day)
Location
UK-small Village in a Valley Near Newcastle
Processor I9 9900KS @ 5.3Ghz
Motherboard Gagabyte z390 Aorus Ultra
Cooling Nexxxos Nova 1080 + 360 rad
Memory 32Gb Crucial Balliastix RGB 4.4GHz
Video Card(s) MSI Gaming X Trio RTX 3090 (Bios and Shunt Modded) 2.17GHz @ 38C
Storage NVME / SSD RAID arrays
Display(s) 38" LG 38GN950-B, 27" BENQ XL2730Z 144hz 1440p, Samsung 27" 3D 1440p
Case Thermaltake Core series
Power Supply 1.6Kw Silverstone
Mouse Roccat Kone EMP
Keyboard Corsair Viper Mechanical
Software Windows 10 Pro
Pretty disappointing for me, anyway. I've been waiting on a viable upgrade from my 295X2 and 290X tri-fire setup; hell, even just a replacement for the 295X2. Vega was my hope, and it just looks like a very, very long wait for a disappointment. I had been hoping for a dual-GPU Vega.
 
Joined
Dec 28, 2012
Messages
3,475 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
So AMD pushes their new HEDT chip with tons of PCIe lanes while simultaneously saying they will not support multiple GPUs going forward?

This from the same company that would not stop going on about how two 480s could beat a 1080?

Between this and Vega's lackluster output, this just seems to confirm that AMD doesn't care much about GPUs anymore. We'll see how well it works out for them. So far, relying on devs for DX12 optimization and multi-GPU has been a trainwreck. This is also going to put a dent in high-refresh-rate and high-res monitors; multi-GPU was the easiest way to get a 1440p144 monitor running at full speed. 4K144 will be a pipe dream for years if we have to rely on a single GPU.
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
So AMD pushes their new HEDT chip with tons of PCIe lanes while simultaneously saying they will not support multiple GPUs going forward?

This from the same company that would not stop going on about how two 480s could beat a 1080?

Between this and Vega's lackluster output, this just seems to confirm that AMD doesn't care much about GPUs anymore. We'll see how well it works out for them. So far, relying on devs for DX12 optimization and multi-GPU has been a trainwreck. This is also going to put a dent in high-refresh-rate and high-res monitors; multi-GPU was the easiest way to get a 1440p144 monitor running at full speed. 4K144 will be a pipe dream for years if we have to rely on a single GPU.

- HEDT for gaming was never sensible, so I'm not sure what market AMD is missing there. And I applaud them for confirming that with this move.
- Multi-GPU has always been troublesome and, above all, costly. Why support something that is a niche for a small part of the market when you only have about 20% market share left?
- DX12 multi-GPU isn't taking off because it's now up to developers; in other words, it's up to shareholders, so it's never going to happen.
- The dual-480 story... well, let's just forget that quickly, and let's also forget about AMD and marketing; they'll never get married.

No, honestly, let them focus on getting Navi up to scratch ASAP so we can actually move ahead, instead of on worthless crap that never really worked as well as a single card and never delivered more than a 10% perf/dollar win in the short term. I'll happily take my multi-GPU as a single board and let them solve the scaling and compatibility on a hardware level by just gluing the stuff together.

4K144 is a pipe dream anyway, because there is not a single interface yet with sufficient bandwidth (I believe only DP 1.4 can do this?), and above all, high-refresh gaming with multiple cards is a latency fiesta, so I really don't see the advantage here. You gain FPS but add latency and frame-time variance issues by the truckload.
 
Joined
Dec 28, 2012
Messages
3,475 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
- HEDT for gaming was never sensible, so I'm not sure what market AMD is missing there. And I applaud them for confirming that with this move.
- Multi-GPU has always been troublesome and, above all, costly. Why support something that is a niche for a small part of the market when you only have about 20% market share left?
- DX12 multi-GPU isn't taking off because it's now up to developers; in other words, it's up to shareholders, so it's never going to happen.
- The dual-480 story... well, let's just forget that quickly, and let's also forget about AMD and marketing; they'll never get married.

No, honestly, let them focus on getting Navi up to scratch ASAP so we can actually move ahead, instead of on worthless crap that never really worked as well as a single card and never delivered more than a 10% perf/dollar win in the short term. I'll happily take my multi-GPU as a single board and let them solve the scaling and compatibility on a hardware level by just gluing the stuff together.
Given they just spent three years dressing up Fiji and somehow managed not to improve on Polaris in perf/watt, I somehow doubt Navi will be very good.

When was the last time AMD delivered on the GPU front, the 290X? Sure, they can glue, but if the architecture falls further and further behind, they will only end up delaying the inevitable. AMD needs to get a modern GPU architecture out the gate at some point. The glue in Ryzen works because, per core, Ryzen is much closer to Intel than any Construction core was. If Ryzen were glued-together Bulldozer cores, it would still suck.

If Navi is just glued-together Vega cores, it will suck. They need a proper base to build off of.
4K144 is a pipe dream anyway, because there is not a single interface yet with sufficient bandwidth (I believe only DP 1.4 can do this?)
Bit of a contradiction there, isn't there? Is there no single interface, or is there a single interface?
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Given they just spent three years dressing up Fiji and somehow managed not to improve on Polaris in perf/watt, I somehow doubt Navi will be very good.

When was the last time AMD delivered on the GPU front, the 290X? Sure, they can glue, but if the architecture falls further and further behind, they will only end up delaying the inevitable. AMD needs to get a modern GPU architecture out the gate at some point. The glue in Ryzen works because, per core, Ryzen is much closer to Intel than any Construction core was. If Ryzen were glued-together Bulldozer cores, it would still suck.

If Navi is just glued-together Vega cores, it will suck. They need a proper base to build off of.

Bit of a contradiction there, isn't there? Is there no single interface, or is there a single interface?

Nah, I just wasn't sure whether DP 1.4 was already implemented on cards and monitors, because you do need both; see the HDMI 2.0 issue.
 
Joined
Feb 16, 2017
Messages
476 (0.18/day)
Direct the blame at the devs. They're the ones not implementing it or riding the Nvidia gravy train. Look at games where it is properly supported: scaling is excellent.

You think AMD/Nvidia can't do dual GPUs just fine? They were doing it for many years. I never had issues with 8800 GT SLI or 4890 xfire (hell, 6950s were great, too). It really started going to shit when I had 7950s. By the time the 290X rolled around, I'd had enough of worthless devs, so single card I went.
But, but, but, but, but, DX12 is going to give us metal access, and the devs are going to give us magic, and, and...!

Oh right, it didn't pan out that way :eek:.

Nah, I just wasn't sure whether DP 1.4 was already implemented on cards and monitors, because you do need both; see the HDMI 2.0 issue.
DP 1.4 can, but it requires compression (Display Stream Compression).
 