
Linux Drivers Point to Upcoming AMD RX Vega Liquid-Cooled, Dual-GPU Solution

Raevenlord

News Editor
Linux patches have already given us a "lot" of information (using "lot" generously there) on AMD's upcoming Vega graphics cards. I'd wager few enthusiasts are still looking towards a dual-GPU solution - not with multi-GPU support mostly absent from recent games, of which Prey is a notable exception. Not unless there were some sort of hardware feature that exposed both dies to games and software as a single GPU, but there I'm entering the realm of joyous, hopeful thinking.

Back to the facts: a May 10th Linux patch has added two more device IDs to the Vega family of products, 0x6864 and 0x6868. These additions bring the total number of Vega device IDs to a healthy 9, still fewer than Polaris' 12. This is in line with the expected number of SKUs for Vega, which should be lower than for Polaris.
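
New device IDs of this kind typically show up as entries in the amdgpu driver's PCI ID table. As a purely illustrative sketch (the chip-family constant and the surrounding table are assumptions for illustration, not lines quoted from the actual patch), such entries tend to look like this:
  • {0x1002, 0x6864, PCI_ANY_ID, PCI_ANY_ID, 0, 0, CHIP_VEGA10},
  • {0x1002, 0x6868, PCI_ANY_ID, PCI_ANY_ID, 0, 0, CHIP_VEGA10},
Here, 0x1002 is AMD's PCI vendor ID, and the final field tells the driver which chip family the new IDs belong to.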





There are two particular lines of code that suggest the involvement of liquid cooling:
  • table->Tliquid1Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid1);
  • table->Tliquid2Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid2);

To me, this reads as a pair of temperature thresholds, and isn't in itself evidence of two chips. "Tliquid1Limit" and "Tliquid2Limit" could point towards the same temperature threshold for two different GPUs, or two different thresholds for a single GPU, where these temperature limits trigger a check against a pre-configured TDP table with values for cooling curves, for example. However, there is one more detail that could give a little push towards the dual-GPU hypothesis, which are these lines here:

  • table->FanGainPlx = hwmgr->thermal_controller.advanceFanControlParameters.usFanGainPlx;
  • table->TplxLimit = cpu_to_le16(tdp_table->usTemperatureLimitPlx);

Now, Plx could mean PLX, as in the ASIC bridge chip that is usually used to connect two different GPUs inside a single graphics card, circumventing the PCIe lane restrictions usually found on platforms and routing both GPUs' signals to the PCIe slot. These chips do heat up as well, so it wouldn't be outside the realm of possibility to have code that increases fan speed based solely on the PLX chip's temperature. However, these are just the most plausible interpretations of what is unfortunately still shrouded in mystery. It's strange to see AMD so quiet in the short months left before their Vega Q2 launch, but hopefully we'll see something more tangible next week.
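
To put the quoted lines in some context, below is a minimal, self-contained sketch of what this kind of thermal-limit plumbing generally looks like. The struct layouts and the helper function are simplified assumptions for illustration only, not the actual amdgpu powerplay code; only the field names are taken from the patch lines above.

  /* Illustrative sketch: field names mirror the quoted patch lines, but the
     struct layouts here are simplified assumptions, not the real headers. */
  #include <stdint.h>

  #define cpu_to_le16(x) ((uint16_t)(x)) /* no-op on a little-endian host */

  struct tdp_table {                      /* limits supplied by the VBIOS */
      uint16_t usTemperatureLimitLiquid1; /* first liquid temperature limit */
      uint16_t usTemperatureLimitLiquid2; /* second liquid temperature limit */
      uint16_t usTemperatureLimitPlx;     /* PLX bridge temperature limit */
  };

  struct pp_table {                       /* table handed to the GPU firmware */
      uint16_t Tliquid1Limit;
      uint16_t Tliquid2Limit;
      uint16_t TplxLimit;
      uint16_t FanGainPlx;                /* fan response tied to PLX temperature */
  };

  /* Copy the VBIOS-supplied limits into the firmware-facing table. */
  static void populate_thermal_limits(struct pp_table *table,
                                      const struct tdp_table *tdp_table,
                                      uint16_t fan_gain_plx)
  {
      table->Tliquid1Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid1);
      table->Tliquid2Limit = cpu_to_le16(tdp_table->usTemperatureLimitLiquid2);
      table->TplxLimit     = cpu_to_le16(tdp_table->usTemperatureLimitPlx);
      table->FanGainPlx    = fan_gain_plx;
  }

Read this way, the liquid limits only tell us the thermal controller knows about liquid cooling; it is the PLX-specific limit and fan gain that hint at a second die on the board.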



View at TechPowerUp Main Site
 
Nobody cares about a dual Vega? I find it very interesting.
300 W TDP, liquid cooled, ~20 TFLOPS because of the lower clocks...
 
This could be interesting.
 
Hopefully they find a way to get it to read as one single GPU chip.
 
Rage Fury MAXX
 
Nobody cares about a dual Vega? I find it very interesting.
300 W TDP, liquid cooled, ~20 TFLOPS because of the lower clocks...


Dual GPU tho...
 
Hopefully they find a way to get it to read as one single GPU chip.
Actually, they did find a way. It mainly comes down to the fact that producing smaller dies on a wafer is more profitable. There are always some defects in the silicon, and affected dies are either cut down or thrown away (this applies to CPUs too: defective 1800Xs, for example, come out as 1500Xs or 1600Xs, and the same goes for Intel, BTW). Smaller dies mean more 100%-working chips per wafer, and there is a way, which AMD has already mentioned, for two dies to be seen as a single one. I need to find a video about it and I shall put it here. It explains some of this, and from the company's perspective it would be great if they could minimize the loss from defective silicon while delivering very powerful GPUs.
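
As a rough back-of-the-envelope illustration of that yield argument, here is a small sketch using the common Poisson yield approximation Y = exp(-D * A); the defect density and die areas below are made-up example numbers, not AMD figures:

  /* Compare full products per unit of wafer area: one big monolithic die
     versus pairs of smaller dies matched up after testing.
     All numbers are made-up examples. */
  #include <math.h>
  #include <stdio.h>

  int main(void)
  {
      const double defects_per_cm2 = 0.2; /* assumed defect density */
      const double small_die_cm2   = 2.5; /* ~250 mm^2 die (assumption) */
      const double big_die_cm2     = 5.0; /* ~500 mm^2 die (assumption) */

      /* Poisson yield model: fraction of dies with zero defects. */
      double y_small = exp(-defects_per_cm2 * small_die_cm2); /* ~0.61 */
      double y_big   = exp(-defects_per_cm2 * big_die_cm2);   /* ~0.37 */

      /* Good small dies can be paired freely after binning, so products
         per cm^2 is (good small dies per cm^2) / 2 for the dual-die case. */
      double big_products_per_cm2  = y_big / big_die_cm2;
      double dual_products_per_cm2 = (y_small / small_die_cm2) / 2.0;

      printf("monolithic products per cm^2: %.3f\n", big_products_per_cm2);
      printf("dual-die products per cm^2:   %.3f\n", dual_products_per_cm2);
      return 0;
  }

With these example numbers, the dual-die approach gets roughly 65% more sellable products out of the same wafer area, which is the economic case for gluing two smaller dies together instead of building one huge one.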
 
How long was the 295X2 the fastest card around? I bet it's still a beast!

I actually regret a tiny bit not going with the R9 295X2. I could have gotten a new R9 295X2 for the price of an equally new GTX 980. Even when CrossFire wouldn't work, it would still basically be at GTX 980 level, but with CrossFire fully working it's easily at GTX 1070 level and beyond. I think I didn't go with it back then because I had a miniATX case and an additional AiO was a problem (I already had one for the CPU).

I think dual Vega will basically be an answer to the upcoming Volta and a tool to beat NVIDIA's current Titan Xp. As for what you guys say about presenting multi-GPU to games as a single GPU: if that is true and they have the means to somehow achieve it, that's pretty awesome.
 
How long was the 295X2 the fastest card around? I bet it's still a beast!

Aha, sure... ask @RCoon

A beast... when it works. Ideal for 4K gaming, it could push the frames needed for a smooth 60 FPS experience on damn near any game (no AA enabled). Unfortunately, dual GPUs aren't getting love from developers anymore, so XFire profiles are dying out fast. I spent the last year of my time with the 295X2 with one chip fully loaded and the second taking on the role of a $500 paperweight. There was little overclocking room (and I had mine fully water cooled), and it really did pump out A LOT of heat into the room. I genuinely never had the heating turned on in that room throughout the year; I just used my PC a little more in winter.

It was a blast while it lasted, but it's really not a "sensible" option for pushing frames - as you can now tell from the fact that I sold it on and went for a 1050 Ti.

Dual-GPU cards are dead imo; they get very little support and hinder themselves due to design constraints. At this stage, I'd not only not recommend a single-card dual-GPU option, I just straight up wouldn't recommend bothering with SLI or CrossFire anymore. Too little support.
 
Until the gaming software environment is rewritten to be dual-GPU friendly, with good scaling and no stutter or latency issues, dual GPU is just the lazy way to say you have the fastest card. Minimum FPS matters a lot, and dual GPUs from both camps have always had that issue. Sure, if a game works well it's great, but in most cases they don't work well enough.

Always buy the fastest single GPU you can afford for trouble-free gaming.

For software development where the latency might be less of an issue (VR - like Radeon Pro), sure, it's viable.
 
What GPU is this?
It doesn't appear to be liquid cooled, as it has provision for two 4-pin PWM fans, and it also has a dual-BIOS switch; the chips are marked E.S. too.
 
What GPU is this?
It doesn't appear to be liquid cooled, as it has provision for two 4-pin PWM fans, and it also has a dual-BIOS switch; the chips are marked E.S. too.
Radeon Pro Duo, maybe
 
A beast... when it works. Ideal for 4K gaming, it could push the frames needed for a smooth 60 FPS experience on damn near any game (no AA enabled). Unfortunately, dual GPUs aren't getting love from developers anymore, so XFire profiles are dying out fast. I spent the last year of my time with the 295X2 with one chip fully loaded and the second taking on the role of a $500 paperweight. There was little overclocking room (and I had mine fully water cooled), and it really did pump out A LOT of heat into the room. I genuinely never had the heating turned on in that room throughout the year; I just used my PC a little more in winter.

It was a blast while it lasted, but it's really not a "sensible" option for pushing frames - as you can now tell from the fact that I sold it on and went for a 1050 Ti.

Dual-GPU cards are dead imo; they get very little support and hinder themselves due to design constraints. At this stage, I'd not only not recommend a single-card dual-GPU option, I just straight up wouldn't recommend bothering with SLI or CrossFire anymore. Too little support.
Sort of the dilemma I'm facing... my heart is telling me to get that 1000 W Seasonic Titanium just in case I CrossFire and get that AMD HEDT platform, but my head is telling me to get the 850 W model instead and be done with it.

Edit: I'm running a 290X + 290 CrossFire setup atm... with the games that scale (Crysis 3, Tomb Raider) it's so nice, while with others (Gears 4, Far Cry) it sucks. Although there is support for Gears, I don't get any scaling whatsoever. Granted, it may be due to the combo I have running, but it's a no-go.
 
Hopefully this isn't the 1080 Ti equivalent that everyone is buzzing about...
 
Hopefully this isn't the 1080 Ti equivalent that everyone is buzzing about...
Being somewhat of an optimist, I'm gonna go out on a limb and say nope. I think this is the Volta tide-over.
 
Being somewhat of an optimist, I'm gonna go out on a limb and say nope. I think this is the Volta tide-over.

It was part of their slide presentation

AMD-VEGA-10-specifications.jpg


Volta is 12 nm; the tide-over would be Vega 20 on 7 nm.

AMD-VEGA-20-specifications.jpg
 
If this is going to be the competitor for the 1080 Ti, then that would just be sad... I really want AMD to succeed, but I don't think dual GPUs are the way in 2017, and they probably won't be in the future either.
 
@Xzibit I said somewhat lol.
@uuuaaaaaa We need it now.
@Liviu Cojocaru I don't know about that future part. It's only a matter of time before GPUs reach that wall that we are starting to see in the CPU realm.
 
Until the gaming software environment is rewritten to be dual-GPU friendly, with good scaling and no stutter or latency issues, dual GPU is just the lazy way to say you have the fastest card. Minimum FPS matters a lot, and dual GPUs from both camps have always had that issue. Sure, if a game works well it's great, but in most cases they don't work well enough.

I was kinda stoked about AMD's APU+GPU scheme. For low-end systems it would be really nice if you could buy a cheap GPU and boost your 30-40 FPS to 45-60 FPS, or scale up the settings/resolution a few notches. But alas, it was not to be, and it fell on the fact that the good APUs cost as much as a mid-range i3/almost an i5, and that the CPU parts were Bulldozer derivatives.

EDIT: Thinking back, it was a tech out of sync with its time. Integrated graphics needs fast memory, and that is hard to pull off on a budget. I am dying to see what the Zen APUs will be like, and it would be so swell if you could pair them with the RX 550.
 
@Xzibit I said somewhat lol.
@uuuaaaaaa We need it now.
@Liviu Cojocaru I don't know about that future part. It's only a matter of time before GPUs reach that wall that we are starting to see in the CPU realm.

Since this new dual Vega still uses a PLX bridge chip, I doubt it... With that said, it is known that Vega supports the Infinity Fabric thing, so if it is used in conjunction with their Ryzen platforms it may be possible, but that is speculation at this point...
 