Wednesday, July 12th 2017

Here Be AMD RX Vega Models' Codenames: Vega XTX, Vega XT, Vega XL

Videocardz is running a story in which some of their sources have seemingly confirmed the Radeon RX Vega models' codenames, tied to the particular GPU configuration each one runs, with some juicy extra tidbits for your consumption pleasure. Naturally, as Videocardz themselves put it, codenames be codenames, and are always subject to change.

However, what is arguably more interesting is the supposed segmentation between models. Apparently, the RX Vega XTX carries the same GPU that ticks inside AMD's Vega Frontier Edition, only with a reference water-cooling solution attached to it. They report that the board should pull in 375 W of power, with the GPU accounting for 300 W of those. The Vega XT will reportedly be a more mundane air-cooled version of the graphics card, like the Frontier Edition versions launched until now (with a reduced 285 W board power, the ASIC now pulling 220 of those watts). The most interesting point, though, is the Vega XL. Videocardz reports that this will be a cut-down version, with 3584 Stream Processors against the Vega XTX and Vega XT's 4096, and that it will be sold exclusively in custom variants designed by AMD's AIB partners. Board power and ASIC power are reportedly the same as on the Vega XT, though, which seems strange considering the not insignificant cut in graphics processing resources. It is as yet unclear how much HBM2 memory the AIB-exclusive Vega XL will carry, but the Vega XTX and Vega XT should both ship with 8 GB of it.

Source: Videocardz

95 Comments on Here Be AMD RX Vega Models' Codenames: Vega XTX, Vega XT, Vega XL

#2
sweet
Prima.Vera said:
375W ?? Jeeeezus!!!
"No barriers. No compromises."

So no whining lol
#3
cyneater
Meh, who cares, mining wankers will artificially keep the prices high
#4
noname00
375 W is what two GTX 1080s will pull in SLI mode.

Edit: On the other hand, I won't mind that if the performance and price are significantly better than what nVidia can offer.
#5
nemesis.ie
cyneater said:
Meh, who cares, mining wankers will artificially keep the prices high
Don't blame the miners, it's the retailers' price gouging that is keeping the prices high. You'd have exactly the same scenario if they were selling cards that were 2x the speed of a 1080 Ti for $200. They would sell out and the retailers would jack up the price ...

IMO, if they want to be "fair" to the gaming/other non-mining market the OEMs should maybe introduce a price cap, e.g. MSRP or refuse to supply sellers if they do this. But then this is capitalism in its purest form - the price reflects what the buyer is willing to pay.

If there was a cap they would likely sell even more to miners, and thus, if production can keep up, the OEM makes more money and the reseller makes the same as if they sell at e.g. 2x (as some of the gouged prices are).

That does assume manufacturing can keep up.

Anyway, TL;DR, it's not the miners' fault really.
#7
uuuaaaaaa
RX Vega XTX, that is a sexy name! Reminds me of the good old ATI X1950 XTX GPUs! Currently Vega FE gets power throttled at its stock 1600 MHz even with a +50% power limit (which equates to 375 W). This means that the XTX will be quite a bit faster (and more power hungry) than the stock FE just from the raw numbers.
@buildzoid has run some tests regarding gpu clock / power consumption and performance scaling on Vega FE.

Edit: If the logic of the naming scheme of old still holds, the XL version should have 16 GB of HBM2, despite the lower shader count.
#8
nemesis.ie
... and that with a "golden" chip apparently, running 80 mV lower than other cards.
#9
biffzinker
220 watts for the downclocked Vega XT GPU doesn't seem that bad to me if the performance matches or exceeds the power consumed.
#10
Lox
Every day, a few more rumors. Hopefully in a week or two we'll have actual facts.
#11
Recus
RejZoR said:
Still, single GTX 1080Ti can reach those numbers:
http://www.tomshardware.com/reviews/aorus-gtx-1080-ti-xtreme-edition,5054-4.html
This again. First you blame Nvidia for not allowing AIBs to go beyond reference TDP, now it's bad that they allowed it.

Reference RX Vega XTX is 375 W, so imagine what an MSI RX Vega XTX Lightning would pull. Of course it won't happen, because it's Fury all over again: only one SKU for custom models.
#14
RejZoR
Anymal said:
We all know what 250 W and 375 W TDP on reference boards means vs. custom beasts and peak power usage, except you I guess. Why such an AMD fanboy?
Lol, people still throwing "omg you're such an AMD fanboy" at me. I guess, because I'm such a huge AMD fanboy, I bought a GTX 1050 Ti by accident as a temporary replacement for my GTX 980 instead of an RX 560... LOOOOOOOOOOOOOOOL. But sure, keep on trying. Now if you'll all excuse me, I must prepare for midday prayers to my prophet Raja.
#15
Manu_PT
RejZoR is a lost cause. He is on every AMD/Nvidia/Intel related topic, always defending AMD, no matter the situation. Business as usual.
#16
Vayra86
Manu_PT said:
RejZoR is a lost cause. He is on every AMD/Nvidia/Intel related topic, always defending AMD, no matter the situation. Business as usual.
As far as I could see, all he pointed out was that a GTX 1080 Ti at max OC will pull over 300 W. Which surely is possible. Then people went full monkey on him for some reason.

If you really want to try and be an adult, just drop the fanboy argument altogether. All of you. Nubs.
#18
RejZoR
Vayra86 said:
As far as I could see, all he pointed out was that a GTX 1080 Ti at max OC will pull over 300 W. Which surely is possible. Then people went full monkey on him for some reason.

If you really want to try and be an adult, just drop the fanboy argument altogether. All of you. Nubs.
Lol, because I pointed out that the GTX 1080 Ti is also capable of pulling 300 W+ of power, I'm all of a sudden defending AMD. Somehow. You just can't have a reasonable dialog with people whose first argument is "omg you're an X fanboy". It's especially funny since all of the whiners seem to have GeForce cards in their systems. Attacking another GeForce owner for being an "AMD fanboy". You can't make this shit up even if you try XD
#19
rainzor
Dimi said:
Same card on TPU consumes 255 W on average and 267 W at peak:
It consumes 255 W in Tom's review as well; 340 W is for the overclocked card. Maybe an idea for TPU: start including cards' power consumption at max OC in your reviews?
#20
uuuaaaaaa
Recus said:
This again. First you blame Nvidia for not allowing AIBs to go beyond reference TDP, now it's bad that they allowed it.

Reference RX Vega XTX is 375 W, so imagine what an MSI RX Vega XTX Lightning would pull. Of course it won't happen, because it's Fury all over again: only one SKU for custom models.
According to @buildzoid, Vega FE has an amazing PCB that rivals the R9 290X Lightning's in terms of power delivery capability. If the reference high-end Vega XTX is like that, I don't think it will really be an issue (plus a dual-BIOS switch).
#23
okidna
Fluffmeister said:
XFX RX Vega XTX.... yes!
It would be even crazier if they used the XXX naming scheme: XFX RX Vega XTX GTS XXX Edition 8/16 GB :D
#24
P4-630
The Way It's Meant to be Played
okidna said:
It would be even crazier if they used the XXX naming scheme: XFX RX Vega XTX GTS XXX Edition 8/16 GB :D
Yeah, I had such a p0rn version back in the day, an XFX 6800 GTS XXX I believe it was....:D:D
#25
DeathtoGnomes
Vayra86 said:
As far as I could see, all he pointed out was that a GTX 1080 Ti at max OC will pull over 300 W. Which surely is possible. Then people went full monkey on him for some reason.

If you really want to try and be an adult, just drop the fanboy argument altogether. All of you. Nubs.
Thief, that's my line!! No surprise that Intel fanbabies single out one guy to attack in every thread.


RejZoR said:
Lol, because I pointed out that the GTX 1080 Ti is also capable of pulling 300 W+ of power, I'm all of a sudden defending AMD. Somehow. You just can't have a reasonable dialog with people whose first argument is "omg you're an X fanboy". It's especially funny since all of the whiners seem to have GeForce cards in their systems. Attacking another GeForce owner for being an "AMD fanboy". You can't make this shit up even if you try XD
You are right, we can't make this up, but you can, it seems.