Wednesday, July 12th 2017

Here Be AMD RX Vega Models' Codenames: Vega XTX, Vega XT, Vega XL

Videocardz is running a story in which some of their sources have seemingly confirmed the Radeon RX Vega models' codenames, each tied to the particular GPU configuration being run, with some juicy extra tidbits for your consumption pleasure. Naturally, as Videocardz themselves put it, codenames be codenames, and are always subject to change.

However, what is arguably more interesting is the supposed segmentation between models. Apparently, the RX Vega XTX carries the same GPU that ticks inside AMD's Vega Frontier Edition, only with a reference water-cooling solution attached. They report that the board should pull 375 W of power, with the GPU accounting for 300 W of those. The Vega XT will reportedly be a more mundane air-cooled version of the graphics card, like the Frontier Edition versions launched so far, with board power reduced to 285 W and the ASIC now pulling 220 of those watts.

The most interesting point, though, is the Vega XL. Videocardz reports that this will be a cut-down version of the chip, dropping from the 4096 Stream Processors of the Vega XTX and Vega XT to 3584, and that it will be sold exclusively in custom variants designed by AMD's AIB partners. Board power and ASIC power are the same as on the Vega XT, though, which seems strange considering the not-insignificant cut in graphics processing resources. It is as yet unclear how much HBM2 memory the AIB-exclusive Vega XL will carry, but the Vega XTX and Vega XT should both come with 8 GB of it.
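Condensed, the rumored lineup looks like this; a quick sketch in Python, with every number taken from the Videocardz report above (none of it confirmed by AMD):

```python
# Rumored RX Vega lineup figures as reported by Videocardz (unconfirmed).
specs = {
    "Vega XTX": {"sps": 4096, "board_w": 375, "asic_w": 300, "cooling": "reference water"},
    "Vega XT":  {"sps": 4096, "board_w": 285, "asic_w": 220, "cooling": "reference air"},
    "Vega XL":  {"sps": 3584, "board_w": 285, "asic_w": 220, "cooling": "AIB custom only"},
}

for name, s in specs.items():
    # Board power minus ASIC power covers memory, VRM losses, fan/pump, etc.
    overhead = s["board_w"] - s["asic_w"]
    print(f"{name}: {s['sps']} SPs, {s['board_w']} W board "
          f"({s['asic_w']} W ASIC + {overhead} W rest), {s['cooling']}")
```

Notably, the rumored board-minus-ASIC overhead is 75 W on the XTX and 65 W on the air-cooled cards, which is plausible territory for HBM2, VRM losses, and the cooler itself.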
Source: Videocardz

95 Comments on Here Be AMD RX Vega Models' Codenames: Vega XTX, Vega XT, Vega XL

#51
EarthDog
Motivation? Likely temporary insanity. :p
Posted on Reply
#52
FordGT90Concept
"I go fast!1!11!1!"
I had an X800 XL back in the day; it was the price/performance winner. Considering AMD doesn't have anything in the $300-400 range, that's a perfect home for the Vega XL. The question is: does it have 8 GiB or 4 GiB of VRAM?

AMD also has to have a GCN5 GPU with a GDDR5X or GDDR6 memory controller in the works. I wonder when that will debut.
Posted on Reply
#53
RejZoR
Anymal: Ok, not a fanboy then, relax, no need to overreact. How about a max-OC'd 1080 Ti vs. stock Vega comparison? Apples and oranges. What was your motivation to post that nonsense?
Someone said Vega's consumption is insane. I've posted proof that Pascal's can be just as insane. That was all there is to it. Whether one pushes it to the max from the factory or leaves some headroom to AIBs, that's up to the card maker.

The reason they usually aren't pushed to the limit is consumption and longevity. I've seen it while overclocking my HD 7950. A bump from 900 MHz to 1 GHz required nearly no voltage increase. 1.1 GHz required 0.05 V more. 1.2 GHz required 0.15 V+ more. And that's what they are balancing: the sweet spot. Once high voltages are needed for small gains, they simply call it a day. But then there are people who like to push things to the max no matter what...
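RejZoR's voltage numbers illustrate the sweet-spot argument nicely if you plug them into the usual dynamic-power approximation P ∝ f·V². A quick sketch; only the voltage deltas come from the post, while the ~1.10 V stock voltage is an assumption for illustration:

```python
# Dynamic power scales roughly with frequency * voltage squared (P ∝ f·V²).
# The voltage deltas match the HD 7950 anecdote above; the 1.10 V baseline
# is an assumed stock voltage, used here only to make the ratios concrete.
points = [(0.9, 1.10), (1.0, 1.10), (1.1, 1.15), (1.2, 1.25)]  # (GHz, V)

base_f, base_v = points[0]
base_p = base_f * base_v ** 2
for f, v in points:
    rel_clock = f / base_f - 1
    rel_power = (f * v ** 2) / base_p - 1
    print(f"{f:.1f} GHz @ {v:.2f} V: +{rel_clock:.0%} clock for +{rel_power:.0%} power")
```

Under these assumptions the last step buys roughly 11 more points of clock for nearly 40 more points of power, which is exactly the region where a vendor "calls it a day".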

EDIT:
Why do people keep calling it GCN5 even though AMD themselves call the units NCUs?
Posted on Reply
#54
FordGT90Concept
"I go fast!1!11!1!"
Graphics Core Next (GCN) is the architecture of the entire GPU including Asynchronous Compute Engine (ACE), Unified Video Decoder (UVD), Video Coding Engine (VCE), and so on. Next Compute Unit (NCU) is basically the design of each core (contains all the SIMDs).
Posted on Reply
#55
EarthDog
RejZoR: Someone said Vega's consumption is insane. I've posted proof that Pascal's can be just as insane. That was all there is to it. Whether one pushes it to the max from the factory or leaves some headroom to AIBs, that's up to the card maker.
It is kind of insane, at stock, no?

Wait, let me dilute that a bit........ :)

It is kind of high, at stock, don't you think? Though one may have existed, I don't recall a single GPU STARTING at 300 W/375 W board power. Scaling be damned, that is one hell of a difference, about 100 W or 40% give or take (didn't do the math).
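Doing the math EarthDog skipped, a rough sketch against a 250 W baseline (the GTX 1080 Ti's rated TDP, used here purely as the comparison point):

```python
# Gap between the rumored Vega XTX figures and a 250 W flagship baseline
# (the GTX 1080 Ti's rated TDP, chosen here as an illustrative reference).
baseline_w = 250
for label, watts in [("ASIC power", 300), ("board power", 375)]:
    diff = watts - baseline_w
    print(f"{label}: {watts} W is +{diff} W (+{diff / baseline_w:.0%}) over {baseline_w} W")
```

So the gap is 50 to 125 W (20% to 50%) depending on whether you compare ASIC or whole-board power, which puts the off-the-cuff "100 W or 40%" estimate in the right ballpark.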
Posted on Reply
#56
xkm1948
With a +50% power limit and a slight overclock to a 1150 MHz core, I have seen my Fury X edge close to 400 W playing FO4. And that was done purely through software-level modification, with the stock AIO. So yeah, Vega is only gonna be worse than that, IMO.

Also, somehow I feel like it is RejZoR versus everyone who doesn't agree with the hype ATM.

Vega is still GCN. The very fact that AMD was able to run it on a slightly modified Fiji driver and demo it last December tells you everything.

It is late, and it seems to be delayed to this fall as well. Move on.
Posted on Reply
#57
Anymal
RejZoR: Someone said Vega's consumption is insane. I've posted proof that Pascal's can be just as insane. That was all there is to it. Whether one pushes it to the max from the factory or leaves some headroom to AIBs, that's up to the card maker.
They will release the first single-GPU card ever with a 375 W TDP, and the first thing you do is link to a tomshw page where an extreme-edition 1080 Ti consumes around 250 W in gaming (stock 1080 Ti TDP is 250 W, remember) and much more when heavily overclocked. That is proof of your...
Posted on Reply
#58
RejZoR
Dude, just stop it already.
Posted on Reply
#59
Jism
Prima.Vera: 375W ?? Jeeeezus!!!
Take into account that that is the worst-case total the card can consume.

But that does not mean you will exceed 375 W when playing games, for example. There are various tricks and software features that cap power usage.

The 9570 had a 220 W TDP as well, but could shave off around 60 W at the wall simply by undervolting, since AMD does not fine-tune their cards or CPUs to the limit.

They'd rather go for a safe approach with slightly higher power consumption. I think the average will be around 250 up to 275 W, not 375.
Posted on Reply
#60
Anymal
Yes, everything is OK, probably just a typo; they meant 275 W, like the 290X. Peace.
Posted on Reply
#61
EarthDog
Don't compare a maxed-out card with a stock one... it wouldn't have started, rezj. ;)



@Jism - As for the worst case: sure, it is, but the card CAN get there, just as it can on the NVIDIA side. That is why it's listed there and not lower. Undervolting is also an option... as it can be for NVIDIA cards too. We need to take this at face value until we see more testing.

My guess is the average on a 375 W card is well over 300 W... they don't overestimate by 50%...

I just know comparing a maxed-out 1080 Ti to stock board power wasn't remotely a good idea. :p
Posted on Reply
#62
Jism
AMD GCN / NCU will always have a disadvantage due to its brute-force approach compared to NVIDIA's. This is why AMD cards are generally favoured by miners. That brute force always comes with slightly higher power consumption. But that doesn't mean AMD cards are bad. In fact, I tossed the NVIDIA brand away back in the FX 5xxx series and never switched back. It has always been AMD since, and their cards performed very well (X800 XT era).
Posted on Reply
#64
sweet
AMD loses to NVIDIA in power consumption; is that even a surprise?

Still a good buy if the price/perf is what we expect from AMD, though (if the miners don't jack up the price).
Posted on Reply
#65
Unregistered
Why is power consumption an issue for something that is geared only towards enthusiasts?
That's like saying I won't buy a Ferrari because it only gets 6 mpg while the Lamborghini gets 7 mpg...
In the end, the people who will buy it don't give a fuck.
#66
ratirt
jmcslob: Why is power consumption an issue for something that is geared only towards enthusiasts?
That's like saying I won't buy a Ferrari because it only gets 6 mpg while the Lamborghini gets 7 mpg...
In the end, the people who will buy it don't give a fuck.
You are right, at least in my eyes. I really don't care much about power consumption; it's more of a bonus to the purchase. Perf/$ is what I'm mostly after. Of course, in the high-end segment, which I assume Vega will end up in.
Posted on Reply
#67
Vayra86
EarthDog: Here be?? Is TPU run by pirates?? :p

overclocked to the max, while these are 300/375 W stock... how was that point lost?
Well.... if you think critically about the Fury X and its real-world OC potential, and how stubborn HBM was to any clock changes... And if you also look at how Ryzen behaves nowadays, optimized for a rather tight max clock range...

I find it reasonable to believe that AMD is taking a different route with its newer hardware releases: they push it to the edge themselves. In the case of Fury X > Vega it would make complete sense to do so. Not only do they need the performance, but the competitor is essentially doing something similar and calling it GPU Boost 3.0, which only gives you fewer guarantees on the box (300 MHz gaps between stock and actual clocks) but has the same net result. And yes, it uses less power = less heat, and therefore still offers OC headroom; but then again, look at the OC left on the 1080 Ti Lightning - 3% really ain't much, is it...

Now that I think of it, the RX 580 is another great example of AMD eating up the OC headroom and marketing it themselves. It's a real trend. Look at Intel's Kaby Lake and X299 releases: increased TDP for clock bumps out of the box. It's a simple method to hide stagnation.
Posted on Reply
#68
BiggieShady
Of course you have to care about power consumption... it's not like the power bill is the only consequence... I mean, the dissipated heat has to go somewhere.
Posted on Reply
#69
ratirt
BiggieShady: Of course you have to care about power consumption... it's not like the power bill is the only consequence... I mean, the dissipated heat has to go somewhere.
Of course it has to go somewhere. But considering the 780 Ti I currently use, and the 670 before it, and the 460 before that, I don't think that's much of a problem. I've also been thinking about a water cooling solution. It all depends, but believe me, I don't think that's a problem with my current configuration and how it spreads the heat. If I find out there's a problem, a slight modification or improvement will solve it.
Posted on Reply
#70
nemesis.ie
I think he means you are putting the heat into your room no matter what, unless you put the PC in another room or run your cooling outside the room/home/office. :)

i.e. in the summer, lots of heat might mean the room becomes too hot to sit in.
Posted on Reply
#71
ratirt
nemesis.ie: I think he means you are putting the heat into your room no matter what, unless you put the PC in another room or run your cooling outside the room/home/office. :)

i.e. in the summer, lots of heat might mean the room becomes too hot to sit in.
Hmm, what about aircon? Does that count? Sitting in a room at boiling-hot temperatures is not my favorite.
Posted on Reply
#72
EarthDog
Vayra86: Well.... if you think critically about the Fury X and its real-world OC potential, and how stubborn HBM was to any clock changes... And if you also look at how Ryzen behaves nowadays, optimized for a rather tight max clock range...

I find it reasonable to believe that AMD is taking a different route with its newer hardware releases: they push it to the edge themselves. In the case of Fury X > Vega it would make complete sense to do so. Not only do they need the performance, but the competitor is essentially doing something similar and calling it GPU Boost 3.0, which only gives you fewer guarantees on the box (300 MHz gaps between stock and actual clocks) but has the same net result. And yes, it uses less power = less heat, and therefore still offers OC headroom; but then again, look at the OC left on the 1080 Ti Lightning - 3% really ain't much, is it...

Now that I think of it, the RX 580 is another great example of AMD eating up the OC headroom and marketing it themselves. It's a real trend. Look at Intel's Kaby Lake and X299 releases: increased TDP for clock bumps out of the box. It's a simple method to hide stagnation.
Yep.. already considered that, and it was accounted for if you look back... I have 10 more posts after that one, V. ;)
(Posts 32 and 44)

Again.. if it overclocks even slightly, 325/400+ W is in the cards. If it has more headroom (like I suspect), it's going to be even higher.
Posted on Reply
#73
BiggieShady
ratirt: Of course it has to go somewhere. But considering the 780 Ti I currently use, and the 670 before it, and the 460 before that, I don't think that's much of a problem. I've also been thinking about a water cooling solution. It all depends, but believe me, I don't think that's a problem with my current configuration and how it spreads the heat. If I find out there's a problem, a slight modification or improvement will solve it.
nemesis.ie: I think he means you are putting the heat into your room no matter what, unless you put the PC in another room or run your cooling outside the room/home/office. :)

i.e. in the summer, lots of heat might mean the room becomes too hot to sit in.
Yeah, two PCs in the same room in the summer... the complete causal chain ends with buying stronger air conditioning.
Posted on Reply
#74
ratirt
BiggieShady: Yeah, two PCs in the same room in the summer... the complete causal chain ends with buying stronger air conditioning.
I think mine does just great in that department. I've got only one computer, but also other stuff that generates heat. So far I've never had problems, and I don't think the Vega card would have much of an impact on my aircon'd environment.
If it does have an impact, I'll just crank the aircon up and that's that. Besides, I think I will go with a water cooling solution anyway.
Posted on Reply
#75
EarthDog
Ehh........... water or air, you are still dissipating the same amount of heat, dude.
Posted on Reply
