
Linux Drivers Point to Upcoming AMD RX Vega Liquid-Cooled, Dual-GPU Solution

Rumor has it we won't get any real info before May 31.
 
Buh bye nvidia compute cards. See ya in the trash can.
 
What GPU is this?
It doesn't appear to be liquid cooled, as it has provision for two 4-pin PWM fans. It also has a dual BIOS switch, and the chips are marked E.S. too.

Hey Caring1, it was just a pic showing off one such solution with an integrated PLX chip. Sadly, not actually a picture of the referenced card :p
 
Looks like that 1st of April fake news turned out to be reality lol. That one actually listed everything we've seen so far.
 
Looks like that 1st of April fake news turned out to be reality lol. That one actually listed everything we've seen so far.

It would be epic if those slides actually turned out to be true xD
 
Rage with liquid cooling for anger management
 
A beast... when it works. Ideal for 4K gaming, it could push the frames needed for a smooth 60FPS experience on damn near any game (no AA enabled). Unfortunately dual GPUs aren't getting love from developers anymore, so XFire profiles are dying out fast. I spent my last year with the 295X2 with one chip fully loaded and the second taking on the role of a $500 paperweight. There was little overclocking room (and I had mine fully water cooled), and it really did pump out A LOT of heat into the room. I genuinely never had the heating turned on in that room throughout the year, I just used my PC a little more in winter.

It was a blast while it lasted, but it's really not a "sensible" option for pushing frames - as you can now tell from the fact I sold it on and went for a 1050ti.

Dual GPU cards are dead imo, they get very little support, and hinder themselves due to design constraints. At this stage I'd not only not recommend a single card dual gpu option, I just straight up wouldn't recommend bothering with SLI or Crossfire anymore. Too little support.
Well, not really. The thing is, and it's obvious, that GPU manufacturers struggle with Moore's law. I mean, they are getting the node shrinks, but they may or may not hit a wall at some point. Manufacturing CPUs or GPUs is a nasty thing. There are so many failures in the silicon on the wafer, and the bigger the die, the more GPU chips end up useless. That is why they shrink the dies: to get better performance and also more dies per wafer, so they don't lose as much money, and you can imagine how expensive these are to produce. Smaller dies, on the other hand, can be as fast as bigger ones if you combine their power.
AMD is speculating, maybe, but they kind of want to push developers toward dual-GPU support. Even NV wouldn't mind. They could sell cheaper chips, have more working dies per wafer, and still make decent GPUs. As a matter of fact, there is a technology that allows two GPUs to be seen as one. I need to find the video. It might be horse shit, but it was pretty interesting, and when I thought about it, it makes sense. DX12 shows that this might be true. Maybe it isn't the best solution for dual GPUs, but it's there.
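To put some rough numbers on that die-size argument, here is a toy back-of-the-envelope calculation (a hypothetical sketch using a simple Poisson defect model and made-up figures, not real fab data):

Code:
#include <math.h>
#include <stdio.h>

/* Toy Poisson yield model: yield = exp(-defect_density * die_area).
 * All numbers below are made up purely for illustration. */
static double die_yield(double die_area_cm2, double defects_per_cm2)
{
	return exp(-defects_per_cm2 * die_area_cm2);
}

int main(void)
{
	const double wafer_area = 706.9; /* ~300 mm wafer in cm^2 */
	const double defects    = 0.2;   /* defects per cm^2 (illustrative) */
	const double big_die    = 5.0;   /* one large GPU, cm^2 */
	const double small_die  = 2.5;   /* half-size GPU, used in pairs */

	double big_good   = (wafer_area / big_die)   * die_yield(big_die, defects);
	double small_good = (wafer_area / small_die) * die_yield(small_die, defects);

	printf("Good large dies per wafer: %.0f\n", big_good);
	printf("Good small dies per wafer: %.0f (enough for %.0f dual-GPU cards)\n",
	       small_good, small_good / 2.0);
	return 0;
}

Even in this crude model the two half-size dies come out ahead per wafer, which is the whole appeal of combining smaller chips instead of building one huge one.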

EDIT:
I found the video. I don't know if you guys have seen it, but it really explains some stuff. I encourage everybody to watch it and say what you think if you haven't seen it yet.
 
Too bad only ~30% of games out there have a proper CrossFire implementation.
Also, FreeSync doesn't work with CrossFire, I think...
 
How long was the 295X2 the fastest card around? I bet it's still a beast!

It was DOA: scaling issues in many games, and in the ones where it did scale, 290X CF was faster and, imo, better suited to dealing with thermals.
 
After a year, AMD finally makes a dual RX 480, just calls it RX Vega, and we'll hear comments like "all future games will use DX12 or Vulkan" or "will support multi-GPU".
 
After a year, AMD finally makes a dual RX 480, just calls it RX Vega, and we'll hear comments like "all future games will use DX12 or Vulkan" or "will support multi-GPU".
There is no such thing as a dual Vega based on an RX 480 core. Where do you get this crap to feed the public with?

EDIT: How stupid does this sound? Vega is something totally new; saying it's 2x RX 480 is just stupid.
Same as saying that a Ryzen CPU is 2x Bulldozer.
 
It will be like Fiji/Fury all over again. A new Radeon Pro Duo with Vega chips (16GB HBM2 in total), a Vega X with 4096 cores, and a Vega with 3584 activated. Probably another Nano with a fully activated chip but lower TDP/clocks too. This will be fun. Fiji was the most interesting chip back then, and Vega will probably be like that again.

PS: CrossFire supports FreeSync.
 
It will be like Fiji/Fury all over again. A new Radeon Pro Duo with Vega chips (16GB HBM2 in total), a Vega X with 4096 cores, and a Vega with 3584 activated. Probably another Nano with a fully activated chip but lower TDP/clocks too. This will be fun. Fiji was the most interesting chip back then, and Vega will probably be like that again.

PS: CrossFire supports FreeSync.
I really don't think AMD will release something like the Pro Duo with Vega cores. Maybe with low-end Vega (since there will be a few versions of Vega), if you can even say that, but that's unlikely. Just my opinion, but who knows.
 
I really don't think AMD will release something like the Pro Duo with Vega cores. Maybe with low-end Vega (since there will be a few versions of Vega), if you can even say that, but that's unlikely. Just my opinion, but who knows.
Why would they put it in the drivers then?
 
I really don't think AMD will release something like the Pro Duo with Vega cores. Maybe with low-end Vega (since there will be a few versions of Vega), if you can even say that, but that's unlikely. Just my opinion, but who knows.
It's very probable that they will, because they've done it for every generation since the HD 5000 series (for their high-end GPUs, I'm not talking about Polaris). Also look at those pictures. They could reuse the Radeon Pro Duo design, which worked very well.
 
say what you think

GPU manufacturers don't want to support multi-GPU setups anymore, and developers don't want to support multi-GPU setups anymore. Honestly, fewer people are supporting/using SLI and Crossfire for games. It's fast going the route where people only use them for compute power.
 
GPU manufacturers don't want to support multi-GPU setups anymore, and developers don't want to support multi-GPU setups anymore. Honestly, fewer people are supporting/using SLI and Crossfire for games. It's fast going the route where people only use them for compute power.
The Radeon Pro Duo never was a real gaming GPU anyway; it was more or less intended for work, with gaming mostly being a "bonus". It wasn't marketed as a gaming card and was never tested here or on many other websites, as AMD didn't really send out samples of it - maybe Tom's got one, not sure, but that's it.

So it's easily possible they could do another RPD with Vega GPUs.
 
The Radeon Pro Duo never was a real gaming GPU anyway; it was more or less intended for work, with gaming mostly being a "bonus". It wasn't marketed as a gaming card and was never tested here or on many other websites, as AMD didn't really send out samples of it - maybe Tom's got one, not sure, but that's it.

So it's easily possible they could do another RPD with Vega GPUs.

I've heard reports of some guys getting insane clocks out of the Fiji XT cores on the Pro Duo under LN2; maybe those cores were better binned too?
 
I've heard reports of some guys getting insane clocks out of the Fiji XT cores on the Pro Duo under LN2; maybe those cores were better binned too?
Probably the opposite, because they used low-power, "Nano"-suited cores for the RPD, as it was made to be efficient, with a 375W TDP maximum at up to 1000MHz each. But of course it comes down to the chip lottery, as always, whether you get a great overclocker. The best-binned GPUs of the Fiji line were usually those in the Fury X, but I saw one guy do some insane clocks on a Fury (1500MHz), as well as HBM (1000MHz - 1 TB/s bandwidth), under LN2 using the Asus Strix.
 
Probably the opposite, because they used low-power, "Nano"-suited cores for the RPD, as it was made to be efficient, with a 375W TDP maximum at up to 1000MHz each. But of course it comes down to the chip lottery, as always, whether you get a great overclocker. The best-binned GPUs of the Fiji line were usually those in the Fury X, but I saw one guy do some insane clocks on a Fury (1500MHz), as well as HBM (1000MHz - 1 TB/s bandwidth), under LN2 using the Asus Strix.

It was Xtreme Addict:

http://forum.hwbot.org/showthread.php?t=142320

My Strix struggles to do 1100MHz through Firestrike, even though I unlocked the shaders to 4096 and customized the BIOS to allow 375W/325A. 1050MHz is fine at stock volts though; the vdroop on this card became bad after I damaged the high-side caps :/
 
Wonder if they call it Radeon Pro Duo... I'm kind of sure it won't be a consumer card.

Edit: Or maybe it's the new Project Quantum. Whatever happened to it?
 
Yeah, that guy, that was a nice read. Never saw anything better with Fiji.
My Strix struggles to do 1100MHz through Firestrike, even though I unlocked the shaders to 4096 and customized the BIOS to allow 375W/325A. 1050MHz is fine at stock volts though; the vdroop on this card became bad after I damaged the high-side caps :/
Fiji generally is a struggle of some kind. I like Fiji, but for me it never was more than an HBM experiment and an unbalanced GPU for too much money; the 980 Ti was simply too good to pass up.
Wonder if they call it Radeon Pro Duo... I'm kind of sure it won't be a consumer card.

Edit: Or maybe it's the new Project Quantum. Whatever happened to it?
Maybe the same name yeah.

Project Quantum, I suspect, was just a showcase. I don't know if they still plan to release it, but it would be nice.
 
Code:
table->FanGainPlx = hwmgr->thermal_controller.advanceFanControlParameters.usFanGainPlx;
table->TplxLimit = cpu_to_le16(tdp_table->usTemperatureLimitPlx);
Now, Plx could mean PLX, as in the ASIC bridge chip that is usually used to connect two different GPUs inside a single graphics card, circumventing the PCIe lane restrictions usually found on platforms and routing their signals to the PCIe slot. These chips actually do heat up as well, so it wouldn't be outside the realm of possibility for a particular line of code to increase fan speed based solely on the PLX chip's temperature. However, these are just the most plausible interpretations of what unfortunately is still shrouded in mystery. It's strange to see AMD so quiet in the short months left before their Vega Q2 launch, but hopefully we'll see something more tangible next week.
I think that's a stretch. The only thing inside the Linux kernel that references this is literally the two lines you posted:
Code:
table->FanGainPlx = hwmgr->thermal_controller.
advanceFanControlParameters.usFanGainPlx;
There is no other reference to this, and it applies to all Vega GPUs according to where it is being called in the kernel. It's possible that a Vega might have a PLX chip; however, it seems more likely that this sensor will just have an undefined value for any Vega without a PLX chip that has this kind of thermal sensor support. It's also odd that this is the only place this name is referenced, so I think it's just there to capture the value in case it is ever used in the future. That does not necessarily indicate that Vega is going to have a dual-GPU solution (even more so since AMD's multi-GPU support in Linux is practically non-existent).
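Just to illustrate what I mean (a hypothetical sketch, not the real vega10 powerplay code): the gain gets copied into the firmware table unconditionally, and on a board without a PLX bridge it would simply sit at its default, typically zero, so the fan controller never reacts to that input.

Code:
#include <stdint.h>

/* Hypothetical illustration only - not actual kernel code. */
struct fan_gains {
	uint16_t edge;
	uint16_t hotspot;
	uint16_t hbm;
	uint16_t plx;	/* stays 0 on boards without a PLX bridge */
};

static void fill_fan_gains(struct fan_gains *table,
			   const struct fan_gains *params)
{
	table->edge    = params->edge;
	table->hotspot = params->hotspot;
	table->hbm     = params->hbm;
	/* Copied unconditionally, like usFanGainPlx in the snippet above;
	 * a zero gain just means the PLX input never drives the fan. */
	table->plx     = params->plx;
}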

It is much more likely that these lines in smu9_driver_if.h are an indicator of a multi-GPU option, but, once again, this only means that there might be a multi-GPU board, not that there will be, since these seem to be more placeholders than anything else.
Code:
/* Gemini Modes */
#define PPSMC_GeminiModeNone 0 /* Single GPU board */
#define PPSMC_GeminiModeMaster 1 /* Master GPU on a Gemini board */
#define PPSMC_GeminiModeSlave 2 /* Slave GPU on a Gemini board */
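Purely as an illustration of how a driver could consume those placeholders (hypothetical code, not taken from the kernel), the mode would just be a branch at init time:

Code:
/* Hypothetical sketch reusing the PPSMC_GeminiMode* defines above. */
static const char *gemini_role(int mode)
{
	switch (mode) {
	case PPSMC_GeminiModeNone:   return "single GPU board";
	case PPSMC_GeminiModeMaster: return "master GPU on a Gemini board";
	case PPSMC_GeminiModeSlave:  return "slave GPU on a Gemini board";
	default:                     return "unknown Gemini mode";
	}
}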
 
Just wishful thinking... but maybe they figured out a way to have two GPUs act as one and share a huge pool of RAM. They did say something about a new memory system...
 