
Here Be AMD RX Vega Model Codenames: Vega XTX, Vega XT, Vega XL

As far as I could see, all he pointed out was that a GTX 1080 Ti on max OC will pull over 300 W. Which surely is possible. Then people went full monkey on him for some reason.

If you really want to try and be an adult, just drop the fanboy argument altogether. All of you. Nubs.

Thief, that's my line!! No surprise that Intel fanbabies single out one guy to attack in every thread.


Lol, because I pointed out that GTX 1080Ti is also capable of pulling 300W+ power, I'm all of a sudden defending AMD. Somehow. You just can't have a reasonable dialog with people whose first argument is "omg you're an X fanboy". It's especially funny since all of the whiners seem to have GeForce cards in their systems. Attacking another GeForce owner for being an "AMD fanboy". You can't make this shit up even if you try XD

You are right, we can't make this up, but you can, it seems.
 
300 W for the watercooled one + 75 W for pump and fan, I guess
 
300 W for the watercooled one + 75 W for pump and fan, I guess

Unlikely, unless they plan on running a Delta fan at 15,000 RPM. With a high-pressure garden pump :D
 
RejZoR is a lost cause. He is on every AMD/NVIDIA/Intel related topic, always defending AMD, no matter the situation. Business as usual.

No one's perfect, but all he pointed out was a custom GTX 1080 Ti's power consumption, and then you & Anymal automatically label him an AMD fanboy lol, it's sad.
 
Here be?? Is TPU run by pirates?? :p

Overclocked to the max, while these are 300/375 W stock... how was that point lost?

Well, maybe AMD will be delivering "overclocked to the max" Vegas to begin with? Sure, there won't be any OC headroom, but who cares if they deliver the max from the get-go? I'd be fine with that if it means the same performance as a maxed-out GTX 1080 Ti...
 
Who knows. The fact of the matter is, right now, they are, STOCK, 300 W and 375 W parts. Get into overclocking and you are looking at 325+ and 400+ W, which is being conservative. You can't get there (400 W+) on a 1080 Ti without hard mods, and likely sub-ambient cooling.
 
Rather high wattage there for Vega. I'm hoping the performance is up there too.
 
And what's weird about that? Who says AMD can't push RX Vega really hard out of the box, which would contribute to consumption comparable to a really hard pushed 1080 Ti? Whether you see the lack of later OC headroom as worrying, or getting the max by default as a plus, that's hard to say at this point. It really depends on where its performance ends up. If they push it this hard and don't even reach a stock 1080 Ti, then that is indeed a bit of a problem. Unless they offset it with a hefty price cut. Then many people would look away and grab it anyway...
 
And what's weird about that? Who says AMD can't push RX Vega really hard out of the box, which would contribute to consumption comparable to a really hard pushed 1080 Ti? Whether you see the lack of later OC headroom as worrying, or getting the max by default as a plus, that's hard to say at this point. It really depends on where its performance ends up. If they push it this hard and don't even reach a stock 1080 Ti, then that is indeed a bit of a problem. Unless they offset it with a hefty price cut. Then many people would look away and grab it anyway...

With HBM2 and the current prices of GPUs, I'm pessimistic about the pricing.
 
And what's weird about that? Who says AMD can't push RX Vega really hard out of the box, which would contribute to consumption comparable to a really hard pushed 1080 Ti? Whether you see the lack of later OC headroom as worrying, or getting the max by default as a plus, that's hard to say at this point. It really depends on where its performance ends up. If they push it this hard and don't even reach a stock 1080 Ti, then that is indeed a bit of a problem. Unless they offset it with a hefty price cut. Then many people would look away and grab it anyway...
Weird? Nothing. I didn't make the point of putting up a fully maxed-out 1080 Ti against a stock 300/375 W card here, bud. I was just pointing out the other side of that thought process to make a complete story for those reading. ;)
 
With HBM2 and the current prices of GPUs, I'm pessimistic about the pricing.

The Ethereum price has started to steadily fall, and the bubble might burst soon. That's good for gamers, bad for AMD.
Right now you can find a few GTX 1080s for around $540 - $570 on Newegg, and that's not too bad.

I wonder how much lower NVIDIA can sell a 1080 Ti for and still make a profit.
 
Lol, because I pointed out that GTX 1080Ti is also capable of pulling 300W+ power, I'm all of a sudden defending AMD. Somehow. You just can't have a reasonable dialog with people whose first argument is "omg you're an X fanboy". It's especially funny since all of the whiners seem to have GeForce cards in their systems. Attacking another GeForce owner for being an "AMD fanboy". You can't make this shit up even if you try XD
@RejZoR , you should know by now that if you are not praising Nvidia and bashing AMD you are automatically a fanboy!

As far as the power consumption goes, it's high, but it does not bother me as long as price and performance live up to it...
 
Don't blame the miners; it's the retailers' price gouging that is keeping the prices high. You'd have exactly the same scenario if they were selling cards that were 2x the speed of a 1080 Ti for $200. They would sell out and the retailers would jack up the price ...

IMO, if they want to be "fair" to the gaming/other non-mining market the OEMs should maybe introduce a price cap, e.g. MSRP or refuse to supply sellers if they do this. But then this is capitalism in its purest form - the price reflects what the buyer is willing to pay.

If there was a cap they would likely sell even more to miners, and thus, if production can keep up, the OEM makes more money and the reseller makes the same if they sell at e.g. 2x (as some of the gouged prices are).

That does assume manufacturing can keep up.

Anyway, TL;DR, it's not the miners' fault really.

It's called supply and demand. When supply goes down because everyone is buying them, you have to raise prices to drive down demand, because you can't keep them in stock.

So it IS the miners' fault ultimately.
 
I don't mind the 375 W power draw as long as it delivers the performance. I still have my AX1500i here waiting :)
 
Weird? Nothing. I didn't make the point of putting up a fully maxed-out 1080 Ti against a stock 300/375 W card here, bud. I was just pointing out the other side of that thought process to make a complete story for those reading. ;)

Yeah, well, what if 375 W for Vega is already maxed out by AMD, instead of having reserve like in the GTX 1080 Ti's case? That's gonna suck for aftermarket makers, who'll only be able to work on cooler designs, and that would be it if there is no headroom left. Or they'll make a really beefy one with a massive cooler and 400 W power draw. I'm sure there would be someone who'd make that XD
 
Power consumption is truly important if you're running on batteries (laptops).
 
In 2017, TDP doesn't matter any more. But it did matter back in 2010, when the shoe was on the other foot.
 
Yeah, well, what if 375 W for Vega is already maxed out by AMD, instead of having reserve like in the GTX 1080 Ti's case? That's gonna suck for aftermarket makers, who'll only be able to work on cooler designs, and that would be it if there is no headroom left. Or they'll make a really beefy one with a massive cooler and 400 W power draw. I'm sure there would be someone who'd make that XD
Lots of what ifs...

Has any AMD card yet been absolutely maxed out? You can overclock Fury some, these low-end cards some... all use more power in doing so, even though they can barely overclock. I was conservative when I said 325 W+/400 W+. If you really want to know what I am thinking, I imagine with overclocking you are looking at 350+ and 425+ W. I don't believe my guess is out of line.

While power use really doesn't matter, getting rid of all that heat does. And the noise involved in doing so as well...
 
It could still be that this 375 W TDP applies when it's flat out doing compute. If it's doing gaming with TBR it may be less. We won't know until we see the release-candidate drivers (assuming they enable that and the other stuff that might be off).

This is no more/less valid speculation than some of the other stuff being thrown around here.

@Gasaraki: regarding supply/demand, no, the price does not need to be raised; it could stay the same, and folks would just need to queue and wait for production. It's pure greed that is increasing the price ...

Increasing the price to reduce demand is an artificial thing and just means that richer people (those who can afford the increase) get their product quicker and the retailer, not the maker, gets more profit.
 
No one's perfect, but all he pointed out was a custom GTX 1080 Ti's power consumption, and then you & Anymal automatically label him an AMD fanboy lol, it's sad.
The custom 1080 Ti he pointed out pulls 260 W during gaming and torture tests, as seen in his link. A 350 W TDP is a completely different story, so he must be an AMD lover.
 
That said, the TDP just means the system must be able to cope with that much draw; it doesn't mean it will draw that much in every workload ...
 
The custom 1080 Ti he pointed out pulls 260 W during gaming and torture tests, as seen in his link. A 350 W TDP is a completely different story, so he must be an AMD lover.

You're still going with the "AMD fanboy" narrative... Shit, I'll cancel all the comedy channels on IPTV and just visit TPU instead... Best shit ever XD I mean, calling someone who hasn't had AMD CPU since AthlonXP era an "AMD lover" and "AMD fanboy" is a thick statement. Fanboy... bwahahahaha XD Sorry, but this is so hilarious. It just is. XD
 
If the logic of the old naming scheme still holds, the XL version should have 16GB of HBM2, despite the lower shader count.

You mind expanding on that? If you can, that is... I mean, I get what you're saying, I'm just missing something central, I think, 'cause I can't understand why it would feature 16GB, whereas the XTX (flagship) should only have 8?

Hope you reply :)
 