Thursday, May 11th 2017

Linux Drivers Point to Upcoming AMD RX Vega Liquid-Cooled, Dual-GPU Solution

Linux patches have already given us a "lot" of information (using "lot" generously) on AMD's upcoming Vega graphics cards. I'd wager few enthusiasts are still looking towards a dual-GPU solution - not with multi-GPU support mostly absent from recent games, of which Prey is a notable exception. Not unless there were some sort of hardware feature that exposed both dies as a single GPU for games and software to handle, but I'm entering the realm of joyous, hopeful thinking here.

Back to the facts: a May 10th Linux patch has added two more device IDs to the Vega family of products: 0x6864 and 0x6868. These additions bring the total number of Vega device IDs to a healthy 9, still below Polaris' 12. This is in line with the expected number of SKUs for Vega, which should be fewer than those available for Polaris.
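For context, device IDs like these are typically registered in the driver's PCI ID table, which maps each ID to a driver-internal chip family. Below is a minimal sketch of what such entries look like, modeled on the amdgpu driver's amdgpu_drv.c; the CHIP_VEGA10 family flag and the exact table contents are assumptions for illustration, not confirmed contents of the patch.

    #include <linux/module.h>
    #include <linux/pci.h>

    /* Sketch of a PCI device ID table, modeled on amdgpu_drv.c.
     * 0x1002 is AMD's PCI vendor ID; the last field carries a
     * driver-internal family flag (CHIP_VEGA10 is an assumption). */
    static const struct pci_device_id pciidlist[] = {
            { 0x1002, 0x6864, PCI_ANY_ID, PCI_ANY_ID, 0, 0, CHIP_VEGA10 },
            { 0x1002, 0x6868, PCI_ANY_ID, PCI_ANY_ID, 0, 0, CHIP_VEGA10 },
            { 0, 0, 0 }  /* list terminator */
    };
    MODULE_DEVICE_TABLE(pci, pciidlist);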
There are two particular lines of code that suggest the involvement of liquid cooling:
  • table->Tliquid1Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid1);
  • table->Tliquid2Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid2);
For me, this reads as some sort of temperature threshold, and isn't in itself proof of two chips. "Tliquid1Limit" and "Tliquid2Limit" could point towards the same temperature threshold for two different GPUs, or two different thresholds for a single GPU, where these temperature limits trigger a check against a pre-configured TDP table with values for cooling curves, for example. However, there is one more detail that could give a little push towards the dual-GPU hypothesis, which are these lines here:
  • table->FanGainPlx = hwmgr->thermal_controller.advanceFanControlParameters.usFanGainPlx;
  • table->TplxLimit = cpu_to_le16(tdp_table->usTemperatureLimitPlx);
Now, Plx could mean PLX, as in the ASIC bridge chip that is usually used to connect two different GPUs inside a single graphics card, circumventing the PCIe lane restrictions usually found on platforms and routing both GPUs' signals to a single PCIe slot. These chips do heat up as well, so it wouldn't be outside the realm of possibility for a line of code that increases fan speed based solely on the PLX chip's temperature. However, these are just the most plausible interpretations of what unfortunately remains shrouded in mystery. It's strange to see AMD so quiet in the few months left before their Vega Q2 launch, but hopefully, we'll see something more tangible next week.
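As an aside, the cpu_to_le16() calls in the quoted lines simply convert the configured limits to the little-endian byte order the GPU's firmware expects. Here is a minimal, self-contained sketch of what that copy step might look like; the structure and function names are hypothetical, and only the field names and the conversion come from the patch.

    #include <linux/types.h>        /* u16, __le16 */
    #include <asm/byteorder.h>      /* cpu_to_le16() */

    /* Hypothetical host-side table of configured limits; field names
     * follow the patch, the layout is an assumption for illustration. */
    struct tdp_table {
            u16 usTemperatureLimitLiquid1;  /* liquid-cooling limit 1, deg C */
            u16 usTemperatureLimitLiquid2;  /* liquid-cooling limit 2, deg C */
            u16 usTemperatureLimitPlx;      /* PLX bridge-chip limit, deg C */
    };

    /* Hypothetical firmware-facing table; stored little-endian. */
    struct smu_thermal_table {
            __le16 Tliquid1Limit;
            __le16 Tliquid2Limit;
            __le16 TplxLimit;
    };

    /* Mirrors the quoted patch lines: copy each configured limit into
     * the firmware table, converting from CPU byte order to LE. */
    static void populate_thermal_limits(struct smu_thermal_table *table,
                                        const struct tdp_table *tdp_table)
    {
            table->Tliquid1Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid1);
            table->Tliquid2Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid2);
            table->TplxLimit     = cpu_to_le16(tdp_table->usTemperatureLimitPlx);
    }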
Sources: Phoronix, ETeknix, WCCFTech

51 Comments on Linux Drivers Point to Upcoming AMD RX Vega Liquid-Cooled, Dual-GPU Solution

#1
mickel116
nobody cares about a dual VEGA? i find it very interesting.
300 W TDP, liquid cooled. ~20 TFLOPS, because of the lower clocks..
#3
springs113
hopefully they find a way to get it to read as one single gpu chip.
#5
phanbuey
mickel116: nobody cares about a dual VEGA? i find it very interesting.
300 W TDP, liquid cooled. ~20 TFLOPS, because of the lower clocks..
Dual gpu tho...
#6
mickel116
phanbuey: Dual gpu tho...
how long was the 295X2 the fastest card around? i bet it's still a beast!
#7
Ferrum Master
mickel116: how long was the 295X2 the fastest card around? i bet it's still a beast!
Aga sure... ask @RCoon
#8
ratirt
springs113: hopefully they find a way to get it to read as one single gpu chip.
Actually, they did find a way. It is mainly because the production of smaller cores on a wafer is more profitable. There are always some defects that come with the silicon, and cores are either cut down or thrown away (this applies to the CPU area too, as defective 1800Xs for example come out as 1500Xs or 1600Xs; same goes with Intel, BTW). Smaller dies mean more 100% working cores, and there is a way, which AMD already mentioned, for 2 dies to be seen as a single one. I need to find a video about it and I shall put it here. It does explain some stuff, and from the company's perspective it would be great if they could minimize the loss from defective cores on the silicon while delivering very powerful GPUs.
#9
Joss
ratirt: there is a way, which AMD already mentioned, for 2 dies to be seen as a single one
I hope. Otherwise it doesn't make sense, with so many new games not supporting multiple GPUs.
#10
RejZoR
mickel116: how long was the 295X2 the fastest card around? i bet it's still a beast!
I actually regret a tiny bit not going with the R9 295X2. I could get a new R9 295X2 for the price of an equally new GTX 980. Even when CrossFireX wouldn't work, it would still be basically GTX 980 level, but with it fully working, it's easily GTX 1070 level and beyond. I think I didn't go with it back then because I had a mini ATX case and an additional AiO was a problem (I already had one for the CPU).

I think Dual Vega will basically be an answer to upcoming Volta and a tool to beat NVIDIA's current Titan Xp. If what you guys say about showing multi-GPU to games as a single GPU is true and they have the means to somehow achieve that, that's pretty awesome.
#11
RCoon
mickel116: how long was the 295X2 the fastest card around? i bet it's still a beast!
Ferrum Master: Aga sure... ask @RCoon
A beast... when it works. Ideal for 4K gaming, it could push the frames needed for a smooth 60 FPS experience on damn near any game (no AA enabled). Unfortunately dual GPUs aren't getting love from developers anymore, so XFire profiles are dying out fast. I spent the last year of my time with the 295X2 with one chip fully loaded and the second taking on the role of a $500 paperweight. There was little overclocking room (and I had mine fully water cooled), and it really did pump out A LOT of heat into the room. I genuinely never had the heating turned on in that room throughout the year, I just used my PC a little more in winter.

It was a blast while it lasted, but it's really not a "sensible" option for pushing frames - as you can now tell from the fact I sold it on and went for a 1050ti.

Dual GPU cards are dead imo, they get very little support, and hinder themselves due to design constraints. At this stage I'd not only not recommend a single card dual gpu option, I just straight up wouldn't recommend bothering with SLI or Crossfire anymore. Too little support.
#12
the54thvoid
Intoxicated Moderator
Until the gaming software environment is rewritten to be dual-GPU friendly with good scaling and no stutter or latency issues, dual GPU is the lazy way to say you have the fastest card. Minimum fps means a lot, and dual GPUs from both camps have always had that issue. Sure, if a game works well it's great, but in most cases they don't work well enough.

Always buy the fastest single gpu you can afford for trouble free gaming.

For software development where the latency might be less of an issue (VR - like Radeon Pro), sure it's viable.
#13
Caring1
Raevenlord: [image from the news post]
What GPU is this?
It doesn't appear to be liquid cooled, as it has provision for two 4-pin PWM fans; it also has a dual BIOS switch, and the chips are marked E.S. too.
#14
ShurikN
Caring1: What GPU is this?
It doesn't appear to be liquid cooled, as it has provision for two 4-pin PWM fans; it also has a dual BIOS switch, and the chips are marked E.S. too.
Radeon Pro Duo maybe
#15
springs113
RCoon: A beast... when it works. Ideal for 4K gaming, it could push the frames needed for a smooth 60 FPS experience on damn near any game (no AA enabled). Unfortunately dual GPUs aren't getting love from developers anymore, so XFire profiles are dying out fast. I spent the last year of my time with the 295X2 with one chip fully loaded and the second taking on the role of a $500 paperweight. There was little overclocking room (and I had mine fully water cooled), and it really did pump out A LOT of heat into the room. I genuinely never had the heating turned on in that room throughout the year, I just used my PC a little more in winter.

It was a blast while it lasted, but it's really not a "sensible" option for pushing frames - as you can now tell from the fact I sold it on and went for a 1050ti.

Dual GPU cards are dead imo, they get very little support, and hinder themselves due to design constraints. At this stage I'd not only not recommend a single card dual gpu option, I just straight up wouldn't recommend bothering with SLI or Crossfire anymore. Too little support.
Sort of the dilemma I'm facing... my heart is telling me to get that 1000 W Seasonic Titanium just in case I Xfire and get that AMD HEDT, but my head is telling me to get the 850 W model instead and be done with it.

Edit: I'm running a 290X + 290 Xfire setup atm... with the games that scale, it's so nice (Crysis 3, Tomb Raider), while with others (Gears 4, Far Cry) it sucks. Although there is support for Gears, I don't have any scaling whatsoever. Granted, it may be due to the combo I have running, but it's a no-go.
#16
Caring1
ShurikN: Radeon Pro Duo maybe
That's it, once I had a name I could find pics all over the place for it.
#17
ERazer
hopefully this isn't the 1080 Ti equivalent that everyone is buzzing about...
#18
springs113
ERazer: hopefully this isn't the 1080 Ti equivalent that everyone is buzzing about...
Being somewhat of an optimist, I'm gonna go out on a limb and say nope. I think this is the Volta tide-over.
#19
Xzibit
springs113: Being somewhat of an optimist, I'm gonna go out on a limb and say nope. I think this is the Volta tide-over.
It was part of their slide presentation.
[slide from AMD's presentation]

Volta is 12nm; the tide-over would be Vega 20 on 7nm.
#20
uuuaaaaaa
springs113: hopefully they find a way to get it to read as one single gpu chip.
That will happen when Navi arrives I think.
#21
Liviu Cojocaru
If this is the competitor for the 1080 Ti then that would be just sad... I really want AMD to succeed, but I don't think dual GPUs are the way in 2017, and they probably won't be in the future either.
#22
springs113
@Xzibit I said somewhat lol.
@uuuaaaaaa We need it now.
@Liviu Cojocaru I don't know about that future part. It's only a matter of time before GPUs reach that wall that we are starting to see in the CPU realm.
#23
Frick
Fishfaced Nincompoop
the54thvoid: Until the gaming software environment is rewritten to be dual-GPU friendly with good scaling and no stutter or latency issues, dual GPU is the lazy way to say you have the fastest card. Minimum fps means a lot, and dual GPUs from both camps have always had that issue. Sure, if a game works well it's great, but in most cases they don't work well enough.
I was kinda stoked about AMD's APU+GPU scheme. For low-end systems it would be really nice if you could buy a cheap GPU and boost your 30-40 FPS to 45-60 FPS, or scale up the settings/resolution a few notches. But alas it was not to be, and it fell on the fact that the good APUs cost as much as a midrange i3/almost an i5, and that the CPU parts were Bulldozer derivatives.

EDIT: Thinking back, it was a tech out of sync with its time. Integrated graphics needs fast memory, and that is hard to pull off on a budget. I am dying to see what the Zen APUs will be, and it would be so swell if you could pair them with the RX 550.
#24
uuuaaaaaa
springs113: @Xzibit I said somewhat lol.
@uuuaaaaaa We need it now.
@Liviu Cojocaru I don't know about that future part. It's only a matter of time before GPUs reach that wall that we are starting to see in the CPU realm.
Since this new dual Vega still uses a PLX bridge chip, I doubt it... With that said, it is known that Vega supports the Infinity Fabric thing, so if it is used in conjunction with their Ryzen platforms it may be possible, but this is speculation at this point...
#25
springs113
Rumor has it, it'll be May 31 before we get any real info.