Wednesday, July 12th 2017
Here Be AMD RX Vega Models' Codenames: Vega XTX, Vega XT, Vega XL
Videocardz is running a story in which some of their sources have seemingly confirmed the Radeon RX Vega models' codenames, tied to the particular GPU each model carries, with some juicy extra tidbits for your consumption pleasure. Naturally, as Videocardz themselves put it, codenames be codenames, and are always subject to change.
However, what is arguably more interesting is the supposed segmentation between models. Apparently, the RX Vega XTX is the same GPU that ticks inside AMD's Vega Frontier Edition, only with a reference water-cooling solution attached to it. They report that the board should pull in 375 W of power, with the GPU accounting for 300 W of those. The Vega XT will reportedly be a more mundane air-cooled version of the graphics card, much like the Frontier Edition versions launched so far (with a reduced 285 W board power, the ASIC now pulling 220 of those watts). The most interesting point, though, is the Vega XL. Videocardz reports that this will be a cut-down version of the chip, with the Vega XTX and Vega XT's 4096 Stream Processors reduced to 3584, and that it will be sold exclusively in custom variants designed by AMD's AIB partners. Board power and ASIC power are the same as on the Vega XT, though, which seems strange considering the not insignificant cut in graphics processing resources. It is as yet unclear how much HBM2 memory the AIB-exclusive Vega XL will carry, but the Vega XTX and Vega XT should both ship with 8 GB of it.
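For quick reference, the rumored lineup can be sketched out in a few lines of Python. All figures below are Videocardz's rumored numbers as reported above, and are subject to change:

```python
# Rumored RX Vega lineup as reported by Videocardz (unconfirmed, subject to change).
lineup = {
    "Vega XTX": {"board_w": 375, "asic_w": 300, "sps": 4096},
    "Vega XT":  {"board_w": 285, "asic_w": 220, "sps": 4096},
    "Vega XL":  {"board_w": 285, "asic_w": 220, "sps": 3584},
}

for name, spec in lineup.items():
    # Fraction of total board power attributed to the GPU die itself
    asic_share = spec["asic_w"] / spec["board_w"]
    print(f"{name}: {spec['sps']} SPs, {spec['board_w']} W board, "
          f"ASIC share {asic_share:.0%}")

# How much of the full die the XL reportedly loses:
cut = 1 - 3584 / 4096
print(f"Vega XL shader cut: {cut:.1%}")  # prints "Vega XL shader cut: 12.5%"
```

Which puts the XL's cut at 12.5% of the shaders while keeping the XT's power figures, the oddity noted above.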
Source:
Videocardz
95 Comments on Here Be AMD RX Vega Models' Codenames: Vega XTX, Vega XT, Vega XL
AMD also has to have a GCN5 GPU with a GDDR5X or GDDR6 memory controller in the works. I wonder when that will debut.
The reason they usually aren't pushed to the limit is power consumption and longevity. I've seen it during HD 7950 overclocking: a bump from 900 MHz to 1 GHz required nearly no voltage increase, 1.1 GHz required 0.05 V more, and 1.2 GHz required 0.15 V or more. That's what they are balancing: the sweet spot. Once you're requiring high voltages for small gains, they simply call it a day. But then there are people who like to push things to the max no matter what...
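To illustrate why that sweet spot exists with rough numbers: dynamic power scales approximately with frequency times voltage squared (P ∝ f·V²), so those small voltage bumps get expensive fast. A quick sketch using the offsets quoted above; the 1.10 V stock voltage is my assumption for illustration, not a figure from the post:

```python
# Rough relative power for the quoted HD 7950 overclocking points, using P ∝ f·V².
# BASE_V is an assumed stock voltage at 1.0 GHz; the +0.00/+0.05/+0.15 V offsets
# are the ones mentioned in the post.
BASE_V = 1.10

points = [(1.0, 0.00), (1.1, 0.05), (1.2, 0.15)]  # (clock in GHz, extra volts)

base = 1.0 * BASE_V ** 2
for ghz, dv in points:
    rel_power = (ghz * (BASE_V + dv) ** 2) / base
    print(f"{ghz:.1f} GHz @ {BASE_V + dv:.2f} V -> ~{rel_power:.2f}x power")
```

Under these assumptions, the last 100 MHz (about 9% more clock) costs roughly 55% more power than stock, which is exactly the kind of trade-off that makes vendors call it a day.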
EDIT:
Why do people keep calling it GCN5 even though AMD themselves call the units NCUs?
Wait, let me dilute that a bit........ :)
It is kind of high at stock, don't you think? Though one may have existed, I don't recall a single GPU STARTING at 300 W/375 W board power. Scaling be damned, that is one hell of a difference, about 100 W or 40%, give or take (didn't do the math).
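For what it's worth, doing the math under the assumption of a roughly 275 W previous flagship board power (a Fury X-class figure; the baseline is my assumption, not from the post) lands close to that guess:

```python
# Quick check of the "about 100 W or 40%" eyeball estimate.
# The 275 W baseline is an assumed previous-flagship board power.
vega_xtx_board = 375
baseline_board = 275

delta = vega_xtx_board - baseline_board
print(f"{delta} W")                      # prints "100 W"
print(f"{delta / baseline_board:.0%}")   # prints "36%"
```

So roughly 100 W and a bit under 40%, in line with the eyeball estimate above.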
Also somehow I feel like it is ReJzor versus everyone who doesn't agree with this hype ATM.
Vega is still GCN. The very fact that AMD was able to run it with a slightly modified Fiji driver and demo it last December tells you everything.
It is late and it seems to be delayed to this Fall as well. Move on.
But that does not mean you will actually hit 375 W of power when playing games, for example. There are various tricks and software features that cap power usage.
The 9570 had a 220 W TDP as well, but you could shave around 60 watts off at the wall simply by undervolting it, since AMD is not fine-tuning their cards or CPUs to the limit.
They rather go for a safe approach with slightly higher power consumption. I think the average will be 250 up to 275 W, not 375.
@Jism - As far as the worst case goes, sure, but it CAN get there, just as it can on the NVIDIA side. That is why it's listed there and not lower. Undervolting is also an option... as it is for NVIDIA cards too. We need to take this at face value until we see more testing.
My guess is the average on a 375 W card is well over 300 W... they don't overestimate by 50%...
I just know comparing a maxed out 1080ti to stock board power wasn't remotely a good idea. :p
Still a good buy if the price/perf is as expected from an AMD though (if the miners don't jack up the price).
That's like saying I won't buy a Ferrari because it only gets 6 mpg while the Lamborghini gets 7 mpg...
In the end the people who will buy it don't give a fuck.
I find it reasonable to believe that AMD is taking a different route with its newer hardware releases: they push it to the edge themselves. In the case of Fury X > Vega it would make complete sense to do so. Not only do they need the performance, but the competitor is essentially doing something similar and calls it GPU Boost 3.0, which essentially just gives you fewer guarantees on the box (300 MHz gaps between stock and actual clocks) but has the same net result. And yes, it uses less power, so less heat, and therefore still offers OC headroom; then again, look at the OC left on the 1080 Ti Lightning: 3% really ain't much, is it...
Now that I think of it, the RX 580 is another great example of AMD eating up the OC headroom and marketing it themselves. It's a real trend. Look at Intel's Kaby Lake and X299 releases: increased TDP for clock bumps out of the box. It's a simple method to hide stagnation.
In the summer, for example, all that heat might mean the room becomes too hot to sit in.
(Post 32 and 44)
Again... if it overclocks even slightly, 325/400+ W is in the cards. If it has more headroom (as I suspect), it's going to be even higher.
If it has an impact I'll just crank the aircon up and be done with it. Besides, I think I will go with a water-cooling solution anyway.