
Intel Partner Prepares Dual Arc "Battlemage" B580 GPU with 48 GB of VRAM

Nvidia made dual GPUs too, going way back.
 
The Voodoo 5 6000 sees this thread and waves its hand at us. Four hands, actually.

https://cdn.mos.cms.futurecdn.net/GGgwYKynHxEJfjAkBtKgmF-650-80.jpg.webp
 
Nvidia made dual GPUs too, going way back.
Yep, you are right. Even though the author mentions the Titan Z twice, there was nothing special about it. In addition to the ones I listed after the Titan Z, here are the dual GPUs before the Titan Z.

GeForce 7950 GX2 2006
GeForce 9800 GX2 2008
Radeon HD 3850 & 3870 X2 2008
Radeon HD 4850 & 4870 X2 2008
GeForce GTX 295 2009
Radeon HD 5970 2009
Radeon HD 6990 2011
GeForce GTX 590 2011
GeForce GTX 690 2012
Radeon HD 7990 2013
Radeon HD 8990 2013

The Titan Z and R9 295X2 were just the last in a long line of consumer dual GPUs. AMD released a few workstation dual GPUs afterwards, as stated in my last comment, but then the trend fell out of favor.
 
The Voodoo 5 6000 sees this thread and waves its hand at us. Four hands, actually.

https://cdn.mos.cms.futurecdn.net/GGgwYKynHxEJfjAkBtKgmF-650-80.jpg.webp

I feel like you've just shown us an AI depiction of a Quad Gen 5 M.2 device with cooling woes... lol, stop deepfaking us with your voodoo 3DFX.
 
I already noted that it is a lazy Photoshop job, not a real render.

[image: intel-insane.jpg]
 
I decree thee.. the BBBBB555558888800000.

I feel like you've just shown us an AI depiction of a Quad Gen 5 M.2 device with cooling woes... lol, stop deepfaking us with your voodoo 3DFX.
No, it's actually real, apparently. Voodoo had a lot of crazy stuff in the works before they died. Linus did a video about one of their unreleased and very rare GPUs. Will link it later.
 
Multiple dual GPU cards from AMD were released before and since the Titan Z, so I'm not sure why only the Titan is mentioned as something people miss. Here are the dual-GPU cards from the Titan Z onward:

Titan Z 2014
R9 295X2 2014
Radeon Pro Duo (Fuji) 2016
Radeon Pro Duo (Polaris 10) 2017
Radeon Pro Vega II Duo 2019
Radeon Pro W6800X Duo 2021 - most notable because two of these went into the last Intel Mac Pro, for four GPUs total
Voodoo had multiple GPUs on a card. Even before the Titan there were dual 7800s, dual 6800s, and dual 6600s on the Nvidia side. But it was only in the 3dfx Voodoo days that they were worth a damn.

Now that GPUs are used for compute, it might be worth it though.
 
Yep, you are right. Even though the author mentions the Titan Z twice, there was nothing special about it. In addition to the ones I listed after the Titan Z, here are the dual GPUs before the Titan Z.

GeForce 7950 GX2 2006
GeForce 9800 GX2 2008
Radeon HD 3850 & 3870 X2 2008
Radeon HD 4850 & 4870 X2 2008
GeForce GTX 295 2009
Radeon HD 5970 2009
Radeon HD 6990 2011
GeForce GTX 590 2011
GeForce GTX 690 2012
Radeon HD 7990 2013
Radeon HD 8990 2013

The Titan Z and R9 295X2 were just the last in a long line of consumer dual GPUs. AMD released a few workstation dual GPUs afterwards, as stated in my last comment, but then the trend fell out of favor.
Those cards had unified memory or at least some sort of fast link between the processor chips, I guess - is that correct? But this double B580 will be just two independent GPUs on one PCB.
 
Those cards had unified memory or at least some sort of fast link between the processor chips, I guess - is that correct? But this double B580 will be just two independent GPUs on one PCB.
I think many of these cards were essentially Crossfire or SLI on a single board. There was a PCIe bridge chip between the two GPUs, so the dual-GPU cards operated exactly like two separate GPU boards in a Crossfire or SLI configuration. The only benefit over two separate cards was the space savings in your case. A game had to support Crossfire and/or SLI to get any benefit from these dual-GPU boards, and ultimately, for that reason, these cards fell out of favor.

Here is a picture from TPU's review of the R9 295X2:

[image: R9 295X2 PCB from TPU's review]


PLX provided the PCIe bridge chip. It doesn't seem like a unified memory architecture.
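
If it helps to picture it, here's a rough sketch of what that looks like from the software side (CUDA device names used purely as an example - any vendor's API enumerates a dual-GPU board the same way): the OS just sees two discrete adapters behind the bridge, each reporting only its own VRAM.

import torch

# A dual-GPU board behind a PCIe bridge is not unified memory: the driver
# enumerates two independent devices, exactly as if two separate cards
# were installed, and each one reports only its own memory pool.
for i in range(torch.cuda.device_count()):
    p = torch.cuda.get_device_properties(i)
    print(f"cuda:{i}: {p.name}, {p.total_memory / 2**30:.1f} GiB")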

 
They don't add up the VRAM - each GPU only has exactly what it's given. So an 8 GB dual-GPU card really means a 2x 4 GB usable setup.
 
That's how it was handled for SLI/CF, at least before the DX12 changes; it was a byproduct of how those modes split the rendering between GPUs. Technically you could devote one GPU to upscaling, frame generation, physics, AI, or post-processing, let it use its own VRAM for those tasks, and then just copy the frame buffer back to the other GPU for output. If it used CXL and shared unified memory pooling, that handoff would be a smoother process. It could be handled more like the way Intel handles the shared L2 cache on E-core clusters.
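
As a rough sketch of that idea (the device indices and workload are made up - this just shows the explicit cross-device copy described above, which CXL-style memory pooling would make cheaper):

import torch

gpu0 = torch.device("cuda:0")  # the GPU that owns the display output
gpu1 = torch.device("cuda:1")  # the helper GPU working in its own VRAM

# Hypothetical 4K frame produced and post-processed entirely on GPU 1...
frame = torch.rand(1, 3, 2160, 3840, device=gpu1)
frame = torch.nn.functional.avg_pool2d(frame, 2)  # stand-in for "post process"

# ...then the finished buffer is copied across the bus to the output GPU.
frame_out = frame.to(gpu0)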
 
GeForce 7950 GX2 2006
GeForce 9800 GX2 2008
Radeon HD 3850 & 3870 X2 2008
Radeon HD 4850 & 4870 X2 2008
GeForce GTX 295 2009
Radeon HD 5970 2009
Radeon HD 6990 2011
GeForce GTX 590 2011
GeForce GTX 690 2012
Radeon HD 7990 2013
Radeon HD 8990 2013
+ Ignoring the pre-2010 cards, the GTX 760x2 is also missing from that list.
 
+ Ignoring the pre-2010 cards, the GTX 760x2 is also missing from that list.
I also didn't address Aleksandark's statement about the Titan Z being impressive with 6 GB of memory. A year before the Titan Z, in 2013, the Radeon HD 7990 already had 6 GB of memory. In the same year as the Titan Z (2014), the Radeon R9 295X2 had 8 GB of memory.

Edit: actually, it looks like the Titan Z had 2 x 6 GB of memory for a total of 12 GB. So that is impressive for the time, and the article should be updated to say 12 GB instead of 6 GB.
 
Since AMD is chopped liver....


And from a quick search:

AMD has offered several dual-GPU cards over the years, primarily using their CrossFire technology. Here's a list, including both consumer and professional models:
Consumer Cards:
  • Radeon HD 3870x2: A dual-GPU card based on the HD 3870.
  • Radeon HD 4850x2: A dual-GPU card based on the HD 4850.
  • Radeon HD 4870x2: A dual-GPU card based on the HD 4870.
  • Radeon HD 5970: A high-end dual-GPU card using two underclocked HD 5870-class GPUs.
  • Radeon HD 6990: A high-end dual-GPU card using two Radeon HD 6970 GPUs.
  • Radeon R9 295X2: A very high-end dual-GPU card using two Radeon R9 290X GPUs.
Professional Cards:
  • Radeon Pro V340: A dual-GPU card based on Vega 56.
  • Radeon Pro Vega II Duo: A dual-GPU card based on the Radeon VII.
  • Radeon Pro W6800X Duo: A dual-GPU card based on Radeon RX 6800.
Other AMD Dual-GPU Technologies:
 
Since AMD is chopped liver....


And from a quick search:

AMD has offered several dual-GPU cards over the years, primarily using their CrossFire technology. Here's a list, including both consumer and professional models:
Consumer Cards:
  • Radeon HD 3870x2: A dual-GPU card based on the HD 3870.
  • Radeon HD 4850x2: A dual-GPU card based on the HD 4850.
  • Radeon HD 4870x2: A dual-GPU card based on the HD 4870.
  • Radeon HD 5970: A high-end dual-GPU card using two underclocked HD 5870-class GPUs.
  • Radeon HD 6990: A high-end dual-GPU card using two Radeon HD 6970 GPUs.
  • Radeon R9 295X2: A very high-end dual-GPU card using two Radeon R9 290X GPUs.
Professional Cards:
  • Radeon Pro V340: A dual-GPU card based on Vega 56.
  • Radeon Pro Vega II Duo: A dual-GPU card based on the Radeon VII.
  • Radeon Pro W6800X Duo: A dual-GPU card based on Radeon RX 6800.
Other AMD Dual-GPU Technologies:
Yeah, I haven't posted in a while as the Nvidia love fest has gotten out of control but I couldn't resist trying to correct the revisionist history of this article. The Titan Z is no more impressive than the many other solutions from 3DFX, ATI/AMD and Nvidia before it was released as many other commenters have pointed out. It will be interesting to see if Intel releases such a solution.
 
I wonder if there will be any kind of interconnect between those 2 GPUs, or if they'll just act totally independently.

I haven't played around with Intel GPUs; if they are any good for compute, then this might be a worthwhile competitor to the 3090 for the VRAM-hungry LLM crowd.
That's exactly what I thought the moment I saw it. I'd instantly buy it if I managed to find it in the $1.5k range. 2 of those for 96GB of VRAM? Makes me wet my pants.
That would make a GREAT inference card for smaller-ish models, since its performance is good enough for those, and I'd be able to leave multiple models running at once without unloading them from memory.
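
For what it's worth, that use case doesn't really need an interconnect - the usual inference libraries already shard a model's layers across whatever GPUs they can see. A minimal sketch with Hugging Face transformers (the model name is a hypothetical placeholder, and CUDA device names are used for illustration; Intel's XPU backend follows the same pattern):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/some-30b-model"  # hypothetical placeholder
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shards the layers across all visible GPUs
    torch_dtype="auto",
)

# Inputs go to the device holding the first shard; activations hop
# between GPUs over PCIe as generation walks through the layers.
inputs = tok("Hello", return_tensors="pt").to("cuda:0")
print(tok.decode(model.generate(**inputs, max_new_tokens=32)[0]))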

Seems like a gimmick, similar to SLI and CrossFire, but

1./ https://www.intel.com/content/www/us/en/support/articles/000093897/graphics.html#primary-content , and
2./ Deep Link has just been cancelled.

So I don't get it. It would need to be designed very differently to benefit from a contiguous 48 GB rather than just a mirrored 24 GB, which, performance-wise, is easily beaten by a higher-end nV card.
I don't think such a product is meant for the consumer market whatsoever.
Many compute platforms can make use of both GPUs without issues.
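
For example, recent PyTorch builds expose Intel GPUs through the "xpu" backend, and a dual-GPU board would simply enumerate as two devices that a compute workload can address separately (a hedged sketch - exact property fields can vary by PyTorch version):

import torch

# Each GPU on the board shows up as its own xpu device with its own VRAM.
if torch.xpu.is_available():
    for i in range(torch.xpu.device_count()):
        props = torch.xpu.get_device_properties(i)
        print(f"xpu:{i} -> {props.name}")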
 
Yeah, I haven't posted in a while as the Nvidia love fest has gotten out of control
Same here.
but I couldn't resist trying to correct the revisionist history of this article.
Been fighting that war for a long time, but money talks louder.
The Titan Z is no more impressive than the many other solutions from 3DFX, ATI/AMD and Nvidia before it was released as many other commenters have pointed out.
It is that impressive, I mean, the brand Ngreedia alone provides their devices with magical powers! :D
It will be interesting to see if Intel releases such a solution.
I do wonder, with how much the industry has advanced, if maybe they should try again.

I mean, it seems that multi-chip designs are not good for GPUs, and monolithic GPUs can't simply keep growing, so maybe multiple GPUs on one board would be a better solution?
 
I mean, it seems that multi-chip designs are not good for GPUs, and monolithic GPUs can't simply keep growing, so maybe multiple GPUs on one board would be a better solution?
That's already the norm in the enterprise space.
Nvidia's GB200 has two B200s on board, and each B200 has two dies, similar to what AMD has been doing for quite some time with their MCM designs.

I don't think this will really take place in the consumer space tho.
 
Hmm, is there a way to install a system on VRAM? :D (It's just a loose thought :) )
But nevertheless, it's interesting.
I looked at the picture and found that it is a Photoshop job, because of the lanes from the GPU to the bottom and the imprints of the thermal pads :D
Edit: it would help to add a note about the Photoshop to the picture itself, because it could otherwise be misinterpreted.

Years ago, someone managed to install Crysis 3 onto a GPU using a RAM-drive partition in VRAM: https://www.techspot.com/news/86981-someone-installed-crysis-3-geforce-rtx-3090-vram.html
 
I wonder if there will be any kind of interconnect between those 2 GPUs, or if they'll just act totally independently.


That's exactly what I thought the moment I saw it. I'd instantly buy it if I managed to find it in the $1.5k range. 2 of those for 96GB of VRAM? Makes me wet my pants.
That would make a GREAT inference card for smaller-ish models, since its performance is good enough for those, and I'd be able to leave multiple models running at once without unloading them from memory.


I don't think such a product is meant for the consumer market whatsoever.
Many compute platforms can make use of both GPUs without issues.

I'd presume CXL if they want to do it properly, but they might still do something like incorporate TB. It would be trivial to incorporate one or two TB ports to connect them internally or externally, or both. It's possible they could even daisy-chain configurations between PCs with TB eventually.
 
Wow, I didn't realize we were back in the 2010s.

Now I know this is not meant for gaming (though I will be curious to see if it can be made to work reasonably well in some way, for... reasons...) and it's really for AI, but it does seem like a blast from the past either way. I used to run dual-GPU solutions all the time (loved my HD 6990s in Quadfire) and it was a fun time. I do miss doing that, as it was fun to see systems with crazy multi-GPU cooling! (Again, I know this is not meant for gaming!)
 
I don't think this will really take place in the consumer space tho.
Enterprise needs density and can pay for it (and that applies to HBM as well). So yeah, I agree.
 