Monday, May 15th 2017

Entire AMD Vega Lineup Reportedly Leaked - Available on June 5th?

Reports are doing the rounds regarding alleged AMD insiders having "blown the whistle", so to speak, on the company's upcoming Vega graphics cards. This leak also points towards retail availability of Vega cards on the 5th of June, which lines up nicely with AMD's May 31st Computex press conference. An announcement there, followed by market availability at the beginning of the following week, does sound like the kind of cadence you would expect from a new product launch.

On to the meat and bones of this story: three different SKUs have been leaked, of which no details are currently known apart from their naming and pricing. AMD's Vega line-up starts off with the RX Vega Core graphics card, which is reportedly going to retail for $399. This graphics card would sell at a higher price than NVIDIA's GTX 1070, which should mean higher performance; higher pricing with merely competitive performance really wouldn't stir up any excitement, so higher performance is the most logical guess. The $399 price point also slots in comfortably above AMD's RX 580, though it leaves space for another SKU to be thrown into the mix at a later date, perhaps at $329; then again, I'm just speculating about AMD's apparent pricing gap at this point.
Next up is the RX Vega Eclipse, which will reportedly retail for $499, going head to head with NVIDIA's GTX 1080 (in fact, slightly cheaper than the majority of AIB versions of that card). The line between the Core and the Eclipse is a little blurry here, since an overclocked GTX 1070 can get close to stock GTX 1080 performance; their performance delta isn't that great. If the RX Vega Core does bring higher performance than the GTX 1070 (as its higher pricing would suggest), that would place it close to stock GTX 1080 performance. Since AMD would want to avoid its RX Vega Core eclipsing (eh) its RX Vega Eclipse in the price/performance department, one can expect (with reservations) that the performance delta between the Core and the Eclipse is larger than their respective pricing indicates. So I would expect the RX Vega Eclipse to offer performance greater than the GTX 1080's.
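To put some very rough numbers on that reasoning, here's a back-of-the-envelope sketch. The only figures taken from the leak are the prices; the performance values are purely hypothetical placeholders (treating the GTX 1070 as a 100-point baseline), meant only to show how price/performance would force the tiers apart.

```python
# Back-of-the-envelope price/performance check for the leaked Vega pricing.
# The "perf" values are hypothetical placeholders, NOT benchmark results;
# only the USD prices come from the leak discussed above.
cards = {
    "RX Vega Core":    {"price": 399, "perf": 110},  # assumed: somewhat above a GTX 1070 (= 100)
    "RX Vega Eclipse": {"price": 499, "perf": 145},  # assumed: somewhat above a GTX 1080
    "RX Vega Nova":    {"price": 599, "perf": 175},  # assumed: roughly GTX 1080 Ti class
}

for name, card in cards.items():
    # Performance points per dollar, scaled to "per $100" for readability.
    value = card["perf"] / card["price"] * 100
    print(f"{name}: {card['perf']} perf @ ${card['price']} -> {value:.1f} perf per $100 spent")

# For the Eclipse not to be "eclipsed" by the Core on price/performance,
# its performance has to scale at least as fast as its price:
# $499 / $399 ~= 1.25, i.e. the Eclipse would need to be ~25% faster than the Core.
print(f"Eclipse/Core price ratio: {499 / 399:.2f}")
```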

Finally, we have the crème de la crème of Vega, the RX Vega Nova. This graphics card is reported to retail for $599, a full $100 cheaper than NVIDIA's GTX 1080 Ti, while looking to compete with it directly. Considering this pricing, and assuming the leak pans out, this would mean we won't be seeing a Vega card capable of competing with NVIDIA's Titan Xp graphics card (at least, not a single-GPU solution). AMD simply would not sell its top-of-the-line Vega for $599 if it were competitive with that NVIDIA titan of a graphics card. Based on AMD's previous pricing strategy, I would expect the company to deliver roughly the same performance as the GTX 1080 Ti, positioning the Nova not as a pure performance product, but as a price/performance contender. What do you make of this leak?
Source: Digiworthy

87 Comments on Entire AMD Vega Lineup Reportedly Leaked - Available on June 5th?

#51
AngryGoldfish
AngryGoldfish: I've seen Polaris with dropped voltages consume very little power. I don't know whether it would be on
Raevenlord: From w1zzard's review, here
That is the ONLY time I have EVER seen that happen. Like, ever. I'd definitely need to see it repeated as it goes against all benchmarks I've seen elsewhere. Thank you for posting though.

Edit: Actually, looking at other TPU reviews they have been able to repeat it with many of their 1070s. I'll have to scour the rest of the web to see if anyone else can mimic their results 'cause like I said it goes against everything I've seen over the last year.
Posted on Reply
#52
Fouquin
W1zzard: I haven't heard anything from AMD other than the Computex press event invite.

No Vega NDA, no info, nothing on reviews.
Give them a poke, and check that you're on a list somewhere. We got word that samples were planned to ship, but not a time frame for when that would be.
Posted on Reply
#53
RejZoR
cdawall: Not quite the same. Like at all.
Comparing GCN from even Polaris to Vega is the exact same level of idiocy.

Besides, GCN is not a physical thing; you can't look at a GPU schematic and say "see, that's GCN!" GCN is a broad name for the many things that make up a rendering pipeline (rasterizer, vertex unit, async unit, etc.). In Vega, this collection of blocks is named the NCU. Guess what: it got a new name because it was changed so much. Just like Fermi CUDA cores aren't the same as those on Pascal, Vega's aren't the same as those in older designs.
Posted on Reply
#54
Captain_Tom
P4-630: I don't expect it to be even close to 150 Watts, for sure not less in power draw than a GTX1070 knowing AMD's current available cards....
This isn't AMD's current cards lol.
Posted on Reply
#55
Basard
The 1080 Ti competes with the Titan Xp, so if the high-end Vega competes with the Ti, then it will compete with the Titan.

Speaking of 16 gigs... I don't see why anybody needs more than 640K.
Posted on Reply
#56
cdawall
where the hell are my stars
RejZoR: Comparing GCN from even Polaris to Vega is the exact same level of idiocy.

Besides, GCN is not a physical thing; you can't look at a GPU schematic and say "see, that's GCN!" GCN is a broad name for the many things that make up a rendering pipeline (rasterizer, vertex unit, async unit, etc.). In Vega, this collection of blocks is named the NCU. Guess what: it got a new name because it was changed so much. Just like Fermi CUDA cores aren't the same as those on Pascal, Vega's aren't the same as those in older designs.
It is the width thing. That is why they are pulling as much power as they do. Pushing them past their efficiency curve isn't helping either.
Posted on Reply
#57
GhostRyder
Huh, well that is an interesting naming scheme if it turns out to be true...

Well, it does make me a bit more interested in seeing these come to light, if they are indeed the new cards.
Posted on Reply
#58
P4-630
Captain_Tom: This isn't AMD's current cards lol.
What part don't you understand? I said:
"I don't expect it to be even close to 150 Watts, for sure not less in power draw than a GTX1070"

AMD's current cards just draw more power than Nvidia's.
Posted on Reply
#59
TheGuruStud
AngryGoldfish: I've seen Polaris with dropped voltages consume very little power. I don't know whether it would be on

That is the ONLY time I have EVER seen that happen. Like, ever. I'd definitely need to see it repeated as it goes against all benchmarks I've seen elsewhere. Thank you for posting though.

Edit: Actually, looking at other TPU reviews they have been able to repeat it with many of their 1070s. I'll have to scour the rest of the web to see if anyone else can mimic their results 'cause like I said it goes against everything I've seen over the last year.
The 1080 isn't that fast. My 980 Ti is only about 11% off, OC to OC, at 1440p.
Posted on Reply
#60
64K
Captain_Tom: The R9 Nano was more efficient than everything Nvidia had until the GTX 1080 finally came out.
Not sure where you're getting that. According to the review here

Posted on Reply
#61
medi01
Raevenlord: From w1zzard's review, here

That's a 12% gain from an OC of an already overclocked card.
But the gap is bigger than 12%:

Posted on Reply
#62
RejZoR
Basard: The 1080 Ti competes with the Titan Xp, so if the high-end Vega competes with the Ti, then it will compete with the Titan.

Speaking of 16 gigs... I don't see why anybody needs more than 640K.
It's 16GB just of HBC memory; the card can address far more than that. Meaning, when a game actually needs more than 16GB and the HBC can't fit any more in local memory, the framerate drop-off will be dramatically smaller than when other cards hit their VRAM limit.

Resident Evil 7 was one such example: enabling the shadow cache pushed VRAM usage past 4GB and caused the GTX 980 to struggle horrendously (even though, without that setting, it runs the game at the same visual quality at 100+ fps). That's the beauty of the HBC system: memory capacity basically becomes limitless. Sure, there is a penalty, since the speed of external sources doesn't match HBM2, but it's better to lose a few frames than three quarters of your framerate.
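For anyone unfamiliar with how the High Bandwidth Cache concept described above is supposed to behave, here's a tiny conceptual sketch. This is not AMD's implementation (HBC is managed by hardware and drivers), and the capacities and access costs below are made-up numbers; it only illustrates the claim that spilling past local HBM2 costs a penalty on the overflow rather than collapsing the whole frame.

```python
# Conceptual sketch of the High Bandwidth Cache idea described above.
# NOT AMD's implementation; capacities and costs are invented numbers used
# only to show the claimed behaviour: overflowing local HBM2 adds a latency
# penalty for the spilled pages instead of tanking the entire frame.
HBM2_CAPACITY_PAGES = 4          # pretend local HBM2 holds 4 pages of assets
HBM2_ACCESS_COST = 1             # arbitrary time units per resident page
SYSTEM_MEMORY_ACCESS_COST = 10   # assumed penalty per page fetched from outside HBM2

def frame_cost(working_set_pages: int) -> int:
    """Total access cost for one frame that touches every page of its working set once."""
    resident = min(working_set_pages, HBM2_CAPACITY_PAGES)
    spilled = max(0, working_set_pages - HBM2_CAPACITY_PAGES)
    return resident * HBM2_ACCESS_COST + spilled * SYSTEM_MEMORY_ACCESS_COST

# Going slightly over capacity only pays the penalty on the overflow,
# which is the "lose a few frames, not three quarters of them" argument.
for pages in (3, 4, 5, 6):
    print(f"working set of {pages} pages -> frame cost {frame_cost(pages)}")
```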
Posted on Reply
#63
the54thvoid
Intoxicated Moderator
This thread's a mess....
Posted on Reply
#64
deemon
P4-630: Is everyone gaming at 4K tomorrow? :)

I don't think I'll fill up all my 8GB so soon @1440p.
Because you can simply add another RAM module to your GPU if the need arises... not.
Posted on Reply
#65
danbert2000
AngryGoldfish: Please show me a single 1070 that's overclocked to match a stock 1080. The 970 can be overclocked to match a 980, but it's common knowledge that the 1080 is leagues above the 1070. Overclocking it closes the gap, but it's still behind by a calculated amount.
My thoughts exactly; the author must not read TechPowerUp's own reviews.
Posted on Reply
#66
danbert2000
TheGuruStud: The 1080 isn't that fast. My 980 Ti is only about 11% off, OC to OC, at 1440p.
Whatever makes you feel better...
Posted on Reply
#67
Captain_Tom
64K: Not sure where you're getting that. According to the review here
Are you not looking at the 4K benchmark? That's an average using TPU's roster of games, a roster that includes dubious inclusions like non-working Batman games and Anno.

Even with the bias, it was up there with Nvidia's most efficient cards. So again: why couldn't AMD do that?

It would be easier this time too, considering Vega is brand new while the R9 Nano used aging GCN.
Posted on Reply
#68
Fluffmeister
Paying a premium of $650 for an underclocked Fiji chip was lunacy, which is why they sold like Mein Kampf in Israel.
Posted on Reply
#69
ZoneDymo
Fluffmeister: Paying a premium of $650 for an underclocked Fiji chip was lunacy, which is why they sold like Mein Kampf in Israel.
Man, that analogy was so edgy, I nearly cut myself....
Posted on Reply
#70
Fluffmeister
ZoneDymo: Man, that analogy was so edgy, I nearly cut myself....
I knew you'd be upset, but I'll try harder if it finally pushes you off the edge.
Posted on Reply
#71
Super XP
oxidized: To wait or not to wait... that is the question :wtf:
Wait. This looks great.
Go AMD Go
Posted on Reply
#72
ratirt
I'm waiting. Not gonna bitch about AMD or NVIDIA to get attention. I know what I know and I'm comfortable with it. I'm waiting because I know it's worth the wait.
Posted on Reply
#73
medi01
the54thvoid: This thread's a mess....
The "reportedly leaked" thing is a mess to begin with.
Posted on Reply
#75
Caring1
Can someone explain to little ol' me: if HBM2 is meant to be so much better than GDDR5, why does it even need 16GB?
Surely 8GB of HBM2 would be enough.
Posted on Reply