
Entire AMD Vega Lineup Reportedly Leaked - Available on June 5th?

I think Wccf came to TPU with this *alleged rumor.

*with today's politics & politicians it's always nice to hedge your bets i.e. never give a straight yes or no :D

I believe they think we started this rumor. Must've missed the source link.

Taking a page from your book, R0H1T, hope you don't mind :rolleyes:
 
I've seen Polaris with dropped voltages consume very little power. I don't know whether it would be on
[attached chart: perf_oc.png]


From w1zzard's review, here

That is the ONLY time I have EVER seen that happen. Like, ever. I'd definitely need to see it repeated as it goes against all benchmarks I've seen elsewhere. Thank you for posting though.

Edit: Actually, looking at other TPU reviews, they have been able to repeat it with many of their 1070s. I'll have to scour the rest of the web to see if anyone else can mimic their results, 'cause like I said, it goes against everything I've seen over the last year.
 
I haven't heard anything from AMD other than the Computex press event invite.

No Vega NDA, no info, nothing on reviews.

Give them a poke, check that you're on a list somewhere. We got word that samples were planned to ship, but no time frame for when that would be.
 
Not quite the same. Like at all.

Comparing GCN, even Polaris's, to Vega is the same level of idiocy.

Besides, GCN is not a physical thing; you can't look at a GPU schematic and say, "See, that's GCN!" GCN is a broad name for the many things that form a rendering pipeline (rasterizer, vertex unit, async unit, etc.). In Vega, this collection is named the NCU. Guess what: it got a new name because it was changed so much. Just as Fermi CUDA cores aren't the same as Pascal's, Vega's aren't the same as those in older designs.
 
The 1080 Ti competes with the Titan XP, so if high-end Vega competes with the Ti, then it will compete with the Titan.

Speaking of 16 gigs... I don't see why anybody needs more than 640k.
 
Comparing GCN, even Polaris's, to Vega is the same level of idiocy.

Besides, GCN is not a physical thing; you can't look at a GPU schematic and say, "See, that's GCN!" GCN is a broad name for the many things that form a rendering pipeline (rasterizer, vertex unit, async unit, etc.). In Vega, this collection is named the NCU. Guess what: it got a new name because it was changed so much. Just as Fermi CUDA cores aren't the same as Pascal's, Vega's aren't the same as those in older designs.

It is the width thing. That is why they are pulling as much power as they do. Pushing them past their efficiency curve isn't helping either.
 
Huh, well that is an interesting naming scheme if it turns out to be true...

It does give me a little more interest in seeing these come to light, if they are the new cards.
 
These aren't AMD's current cards, lol.

What part don't you understand? I said:
"I don't expect it to be even close to 150 Watts, for sure not less in power draw than a GTX1070"

AMD's current cards just draw more power than Nvidia's.
 
I've seen Polaris with dropped voltages consume very little power. I don't know whether it would be on


That is the ONLY time I have EVER seen that happen. Like, ever. I'd definitely need to see it repeated as it goes against all benchmarks I've seen elsewhere. Thank you for posting though.

Edit: Actually, looking at other TPU reviews, they have been able to repeat it with many of their 1070s. I'll have to scour the rest of the web to see if anyone else can mimic their results, 'cause like I said, it goes against everything I've seen over the last year.

The 1080 isn't that fast. My 980 Ti is only about 11% behind, OC to OC, at 1440p.
 
The R9 Nano was more efficient than everything Nvidia had until the GTX 1080 finally came out.

Not sure where you're getting that. According to the review here

[attached charts: perfwatt_3840.gif and perfwatt_2560.gif — performance per watt at 3840×2160 and 2560×1440]
 
The 1080 Ti competes with the Titan XP, so if high-end Vega competes with the Ti, then it will compete with the Titan.

Speaking of 16 gigs... I don't see why anybody needs more than 640k.

16GB is just the HBC memory. The card can address far more than that. Meaning, when a game actually needs more than 16GB and the HBC can't hold any more locally, the framerate drop-off will be dramatically smaller than when other cards hit their VRAM limit.

Resident Evil 7 was one such example: enabling the shadow cache pushed VRAM usage past 4GB and caused the GTX 980 to struggle horrendously (even though without that setting, at the same visual quality, it runs the game at 100+ fps). That's the beauty of the HBC system. Memory capacity becomes basically limitless. Sure, there's a penalty since external sources can't match HBM2's speed, but it's better to lose a few frames than 3/4 of your framerate.
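The contrast described above can be sketched as a toy model: a hard VRAM limit degrades all at once, while an HBC-style cache only pays for the overflow. All the numbers and function names here are invented for illustration; real penalties depend on bus speed, access patterns, and driver behavior.

```python
def frame_time_hard_limit(working_set_gb, vram_gb, base_ms=10.0):
    """Traditional card: exceed VRAM and constant swapping over the bus
    balloons frame time (the '3/4 less framerate' case)."""
    if working_set_gb <= vram_gb:
        return base_ms
    return base_ms * 4.0  # severe, all-or-nothing stall

def frame_time_hbc(working_set_gb, hbc_gb, base_ms=10.0):
    """HBC-style card: only the overflowing pages come from system memory,
    so the penalty scales with how much spills, not all-or-nothing."""
    if working_set_gb <= hbc_gb:
        return base_ms
    overflow_fraction = (working_set_gb - hbc_gb) / working_set_gb
    return base_ms * (1.0 + 2.0 * overflow_fraction)  # gentle slowdown

for ws in (12, 16, 20):
    print(ws, frame_time_hard_limit(ws, 16), round(frame_time_hbc(ws, 16), 2))
```

The point of the sketch: at a 20GB working set against a 16GB limit, the hard-limit model quadruples frame time while the paging model only grows it in proportion to the spilled fraction.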
 
This thread's a mess....
 
Is everyone gaming at 4K tomorrow? :)

I don't think I'll fill up all my 8GB so soon @1440p.

Because you can simply add another RAM module to your GPU if the need arises... not.
 
Please show me a single 1070 that's overclocked to match a stock 1080. The 970 can be overclocked to match a 980, but it's common knowledge that the 1080 is leagues above the 1070. Overclocking closes the gap, but it's still measurably behind.

My thoughts exactly, the author must not read Techpowerup's own reviews.
 
Not sure where you're getting that. According to the review here

Are you not looking at the 4K benchmark? That's an average across TPU's roster of games, a roster that includes dubious inclusions like non-working Batman games and Anno.

Even with the bias, it was up there with Nvidia's most efficient cards. So again - why couldn't AMD do that again?

It would be easier this time too considering Vega is brand new, and the R9 Nano used aging GCN.
 
Paying a premium of $650 for an underclocked Fiji chip was lunacy, hence why they sold like Mein Kampf in Israel.
 
Paying a premium of $650 for an underclocked Fiji chip was lunacy, hence why they sold like Mein Kampf in Israel.

Man, that analogy was so edgy, I nearly cut myself....
 
Man, that analogy was so edgy, I nearly cut myself....

I knew you'd be upset, but I'll try harder if it finally pushes you off the edge.
 
I'm waiting. Not gonna bitch about AMD or NVIDIA to get attention. I know what I know and I'm comfortable with it. I'm waiting because I know it's worth the wait.
 
Leaked, leek, lick.....
 