Tuesday, June 5th 2018

AMD Demonstrates 7nm Radeon Vega Instinct HPC Accelerator

AMD demonstrated the world's first GPU built on the 7-nanometer silicon fabrication process: a Radeon Vega Instinct HPC/AI accelerator with a 7 nm GPU based on the "Vega" architecture at its heart. The chip is an MCM of a 7 nm GPU die and 32 GB of HBM2 memory spread over four stacks (4096-bit memory bus width). It's also the first product to feature a removable Infinity Fabric interface (a competitor to NVIDIA's NVLink interface). There will also be variants based on the common PCI-Express 3.0 x16 interface. The card supports hardware virtualization and new deep-learning ops.
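As a sanity check, the headline memory numbers decompose cleanly per stack; a minimal sketch (the 1024-bit-per-stack interface width is the standard HBM2 figure, assumed here rather than stated by AMD):

```python
# How the reported memory figures break down per HBM2 stack.
# Assumption: the standard 1024-bit HBM2 interface per stack.
stacks = 4
bus_per_stack_bits = 1024    # standard HBM2 interface width per stack
gb_per_stack = 32 // stacks  # 32 GB total spread over four stacks

print(f"bus width: {stacks * bus_per_stack_bits} bits")  # -> 4096 bits
print(f"capacity:  {stacks * gb_per_stack} GB")          # -> 32 GB
```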

100 Comments on AMD Demonstrates 7nm Radeon Vega Instinct HPC Accelerator

#51
T4C Fantasy
CPU & GPU DB Maintainer
londisteNot any more, no :(
W1zzard has it at 486; he measures them too. Now I'm curious.
Posted on Reply
#52
londiste
T4C FantasyW1zzard has it at 486; he measures them too. Now I'm curious.
I am probably wrong then.
Although mine did measure to around 520, and I have played with images a couple of times before to check sizes, and they ended up in the same vicinity.
Posted on Reply
#53
T4C Fantasy
CPU & GPU DB Maintainer
londisteI am probably wrong then.
Although mine did measure to around 520, and I have played with images a couple of times before to check sizes, and they ended up in the same vicinity.
We will know soon if Vega 20 is 350 mm² xD
Posted on Reply
#54
londiste
I tried googling this a bit to see if I remembered wrong and it was measured to around the official size.
- GamersNexus measured it at 20.25 mm × ~26 mm = 526.5 mm²
- PC Perspective measured it at 25.90 mm × 19.80 mm = 512.82 mm²
PCPer's update on the size triggered Raja's tweet.

Do we have another source for the 484 mm² figure?
Closest perfect square to the 510-520 measured sizes would be 23² = 529, wouldn't it?
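For reference, the arithmetic behind those figures (a minimal sketch; treating the die as a plain width × height rectangle is my assumption, with no allowance for scribe lines):

```python
# Recompute the die-size measurements quoted above from their raw dimensions.
measurements = {
    "GamersNexus":    (20.25, 26.00),  # mm x mm, second figure approximate
    "PC Perspective": (25.90, 19.80),
}
for source, (w_mm, h_mm) in measurements.items():
    print(f"{source}: {w_mm} x {h_mm} = {w_mm * h_mm:.2f} mm^2")

# Closest integer "perfect square" to the 510-520 range: 23 mm x 23 mm.
print(f"23 x 23 = {23 * 23} mm^2")  # vs. the official 486 mm^2 figure
```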
Posted on Reply
#55
bug
Vya DomusI am sure you know how this works as well. You can bet this will be at least half or maybe even a third of the price of a V100. It's very much competitive where it needs to be, actually. And it's not just performance that matters; so does software support. Nvidia has for years refused to support anything above OpenCL 1.2, whereas AMD does, and for these things that sure as hell matters.
Funny thing is, I've seen Nvidia with OpenCL 1.2 beat AMD with OpenCL 2.0.
I'm not sure what the problem with OpenCL is; CUDA should have died a long time ago if OpenCL was any good.
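For anyone who wants to check which OpenCL version their driver actually reports, a few lines of pyopencl will print it (a sketch, assuming the pyopencl package and a vendor OpenCL runtime are installed):

```python
# Print the OpenCL version string each platform/device driver reports.
# Requires: pip install pyopencl, plus a vendor OpenCL runtime.
import pyopencl as cl

for platform in cl.get_platforms():
    print(platform.name, "-", platform.version)  # e.g. "OpenCL 1.2 CUDA ..."
    for device in platform.get_devices():
        print("  ", device.name, "-", device.version)
```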
Posted on Reply
#56
T4C Fantasy
CPU & GPU DB Maintainer
londisteI tried googling this a bit to see if I remembered wrong and it was measured to around the official size.
- GamersNexus measured it at 20.25 mm × ~26 mm = 526.5 mm²
- PC Perspective measured it at 25.90 mm × 19.80 mm = 512.82 mm²
PCPer's update on the size triggered Raja's tweet.

Do we have another source for the 484 mm² figure?
Closest perfect square to the 510-520 measured sizes would be 23² = 529, wouldn't it?
PC Perspective seems closer; GamersNexus didn't use a very good measuring system for Vega at the time.

Do you have 1 more source?
Posted on Reply
#57
deu
Vayra86Here we go again.

If this card is 1080 Ti performance, it's already too late. Not interested, and as expected.
To you it might not seem interesting, but to many it very well might.

Earlier, the GPU race was about getting to '1440p 60-144 fps' ultra (which the 1080 Ti does pretty well in almost anything on the market).
The "NEXT" challenge is 4K at 144 fps, not 4K at 60+ fps, because the 1080 Ti already does that too.
So right now the main reason to get a card that performs better than a 1080 Ti is if you own one of the new 4K 144 Hz screens (priced at 3000 dollars), or if you have some other graphics-intensive purpose that only a 1080 Ti+ can solve (but gaming is not one of them). The thing is that the 4K 144 Hz screens will take years to come down in price, and until that happens, to a growing share of people 1080 Ti performance is "good enough" for any game and any screen out there (unless you are on ultrawide 4096x1440 or something). My bet is that we will see a slowdown in performance leaps for some years and a more competitive focus on price. So the big question is: can AMD price their GPU competitively, and can they keep miners away from it?

I have a 1440p 165 Hz screen with G-Sync and a 1080 Ti. If I got a card that was twice as good, cool, but in almost any game I would never notice, since my fps would be 100+ with G-Sync on anyway. I'm not arguing your opinion; I'm just saying that I think a lot of other people will put value for money first (and as said before, the focus will shift back to value for money, since we are in a valley where king-of-the-hill performance does not "take any hills the others will not be able to climb anyway" (you can quote me on the last part, that was brilliant of me....)).
Posted on Reply
#58
T4C Fantasy
CPU & GPU DB Maintainer
deuTo you it might not seem interesting, but to many it very well might.

Earlier, the GPU race was about getting to '1440p 60-144 fps' ultra (which the 1080 Ti does pretty well in almost anything on the market).
The "NEXT" challenge is 4K at 144 fps, not 4K at 60+ fps, because the 1080 Ti already does that too.
So right now the main reason to get a card that performs better than a 1080 Ti is if you own one of the new 4K 144 Hz screens (priced at 3000 dollars), or if you have some other graphics-intensive purpose that only a 1080 Ti+ can solve (but gaming is not one of them). The thing is that the 4K 144 Hz screens will take years to come down in price, and until that happens, to a growing share of people 1080 Ti performance is "good enough" for any game and any screen out there (unless you are on ultrawide 4096x1440 or something). My bet is that we will see a slowdown in performance leaps for some years and a more competitive focus on price. So the big question is: can AMD price their GPU competitively, and can they keep miners away from it?

I have a 1440p 165 Hz screen with G-Sync and a 1080 Ti. If I got a card that was twice as good, cool, but in almost any game I would never notice, since my fps would be 100+ with G-Sync on anyway. I'm not arguing your opinion; I'm just saying that I think a lot of other people will put value for money first (and as said before, the focus will shift back to value for money, since we are in a valley where king-of-the-hill performance does not "take any hills the others will not be able to climb anyway" (you can quote me on the last part, that was brilliant of me....)).
The 1080 Ti doesn't do 60 fps at 4K in any new titles: FF15 is 42 fps at max settings, FC5 ~50 fps.
If you buy a 1080 Ti, only max settings are acceptable xD
Posted on Reply
#59
londiste
I think what Vayra86 meant was that the 1080 Ti has been out for over a year now. Effectively, we have that performance level covered.

On maxed-ish settings in new games, the 1080 Ti will not do 1440p@144/165 or UHD@60; it tends to fall short. More performance would still be nice.
At the same time, if either AMD or Nvidia can do what Nvidia did last time with the GTX 1070 (same performance at 60% of the TDP), that would be very nice as well.
Posted on Reply
#60
InVasMani
It really seems like a 1.66x efficiency and performance increase could be within reach with 7 nm Vega, or close to it, especially taking into account an overclock/BIOS mod. With 16 GB of VRAM that would be pretty solid, but it depends on what Nvidia puts out as well, and at what price point.
Posted on Reply
#61
londiste
T4C FantasyPc perspective seems closer, gamersnexus didnt use a very good measuring system for vega at the time
Do you have 1 more source?
Agree on the GamersNexus part. PCPer, especially after corrections, seems close to what I would expect it to be. I know I have seen the size being measured in several other places, but my google-fu is betraying me right now.

Although I do quite trust this guy's measurements. Granted, this is an engineering sample, but they cannot be that different, can they?
die size: 26.10 mm × 19.53 mm (509.73 mm²)
[MEDIA=flickr]24FgMzM[/MEDIA]
Posted on Reply
#62
T4C Fantasy
CPU & GPU DB Maintainer
londisteAgree on the GamersNexus part. PCPer, especially after corrections, seems close to what I would expect it to be. I know I have seen the size being measured in several other places, but my google-fu is betraying me right now.

Although I do quite trust this guy's measurements. Granted, this is an engineering sample, but they cannot be that different, can they?

[MEDIA=flickr]24FgMzM[/MEDIA]
They are all about the same, so 510 must be the perfect square?
Posted on Reply
#63
londiste
Well, that perfect square is BS either way. 486 is the official number according to the Vega whitepaper (page 2, top right).
Does not change the fact that everyone seems to measure it to a slightly larger size.
Posted on Reply
#64
T4C Fantasy
CPU & GPU DB Maintainer
londisteWell, that perfect square is BS either way. 486 is the official number according to the Vega whitepaper. That's where the TPU DB got the size, I assume?
Does not change the fact that everyone seems to measure it to a slightly larger size.
Well, the measured die is a bit bigger than the official silicon figure for sure; I'll put it as 510.
Posted on Reply
#65
TheinsanegamerN
OH WOW!!! 4096 BITS?!?!? That's... just like the Fury X, which flopped. Who cares about the number of bits on a data bus? That's like caring about the number of cores on a GPU while disregarding architectures (e.g. my GPU has 4000 cores, yours only has 3000, so mine is better! Ignore that they are two completely different GPUs). Performance is what matters, not hardware numbers, and if AMD is advertising bus width again, it might mean their chips are not performing as intended yet.
JismOther websites reported that the 7 nm Vega is twice as efficient compared to the 14 nm one, and that the performance over the previous one is around 35% more. So we're looking at 1080 Ti levels here.

Edit: god, Lisa is so sexy with that GPU.
And knowing AMD, it will launch after the 1180 Ti, when the 1280 Ti is about to launch.
Posted on Reply
#66
Dalai Brahma
JismOther websites reported that the 7 nm Vega is twice as efficient compared to the 14 nm one, and that the performance over the previous one is around 35% more. So we're looking at 1080 Ti levels here.

Edit: god, Lisa is so sexy with that GPU.
Is that Jackie Chan with that GPU?! Is he the new CEO?? lol
Posted on Reply
#67
T4C Fantasy
CPU & GPU DB Maintainer
TheinsanegamerNOH WOW!!! 4096 BITS?!?!? That's... just like the Fury X, which flopped. Who cares about the number of bits on a data bus? That's like caring about the number of cores on a GPU while disregarding architectures (e.g. my GPU has 4000 cores, yours only has 3000, so mine is better! Ignore that they are two completely different GPUs). Performance is what matters, not hardware numbers, and if AMD is advertising bus width again, it might mean their chips are not performing as intended yet.


And knowing AMD, it will launch after the 1180 Ti, when the 1280 Ti is about to launch.
Considering the clock speed is 1200 MHz on a 4096-bit bus, it actually does matter; that's, I think, 300 MHz higher, with 2x the bus width, than Vega 10.

And Fury was like 500 MHz with 4 GB of HBM, a huge difference.
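To put numbers on that: peak HBM bandwidth is just bus width × clock × 2 (double data rate). A minimal sketch; the 1200 MHz Vega 20 clock is the figure claimed above, not a confirmed spec:

```python
# Peak bandwidth for double-data-rate HBM: bytes per cycle x clock x 2.
def hbm_bandwidth_gbs(bus_width_bits: int, clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * clock_mhz * 2 / 1000

for name, bus_bits, clock_mhz in [
    ("Fury X (HBM1)",  4096,  500),   # matches the official 512 GB/s
    ("Vega 10 (HBM2)", 2048,  945),   # matches the official ~484 GB/s
    ("Vega 20 (HBM2)", 4096, 1200),   # clock as claimed above -> ~1229 GB/s
]:
    print(f"{name}: {hbm_bandwidth_gbs(bus_bits, clock_mhz):.1f} GB/s")
```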
Posted on Reply
#68
Vayra86
deuTo you it might not seem interesting, but to many it very well might.

Earlier, the GPU race was about getting to '1440p 60-144 fps' ultra (which the 1080 Ti does pretty well in almost anything on the market).
The "NEXT" challenge is 4K at 144 fps, not 4K at 60+ fps, because the 1080 Ti already does that too.
So right now the main reason to get a card that performs better than a 1080 Ti is if you own one of the new 4K 144 Hz screens (priced at 3000 dollars), or if you have some other graphics-intensive purpose that only a 1080 Ti+ can solve (but gaming is not one of them). The thing is that the 4K 144 Hz screens will take years to come down in price, and until that happens, to a growing share of people 1080 Ti performance is "good enough" for any game and any screen out there (unless you are on ultrawide 4096x1440 or something). My bet is that we will see a slowdown in performance leaps for some years and a more competitive focus on price. So the big question is: can AMD price their GPU competitively, and can they keep miners away from it?

I have a 1440p 165 Hz screen with G-Sync and a 1080 Ti. If I got a card that was twice as good, cool, but in almost any game I would never notice, since my fps would be 100+ with G-Sync on anyway. I'm not arguing your opinion; I'm just saying that I think a lot of other people will put value for money first (and as said before, the focus will shift back to value for money, since we are in a valley where king-of-the-hill performance does not "take any hills the others will not be able to climb anyway" (you can quote me on the last part, that was brilliant of me....)).
Yup. Cool story bro. Meanwhile there are even 1440p games where the 1080 Ti falls short, but keep living the dream. And next year there will be lots more.

Stagnation in GPUs = a step back. And the result of that is that playable performance at all levels becomes more expensive. If GPUs don't get more powerful each gen on EACH tier in the product stack, it means you will pay the same or even more for the same performance as last year, and this does not just apply to the 1080 Ti performance level... Why? Because games do get more demanding over time, and they will edge closer to a higher GPU tier every year. Pascal has already been out for nearly two years now and there is no news about anything that will top it. Turing and a 1180? Same-ish performance, most likely. AMD's Vega? Not a gaming GPU (surely you don't still think they will keep trying that)... and it never was.

Your statement that the next challenge is 4K 144 fps is complete and utter nonsense. You're talking about a top 1% that might consider that between now and the next 2-3 years, if it's even that. It combines a GPU and CPU bottleneck with a near-guarantee that you won't ever comfortably hit that target. If you want to burn everything on computer hardware... Either way, AMD is not going to move us closer to that target with Vega; not with 35% more performance and not even with 50% more performance.

In other news...
wccftech.com/exclusive-amd-navi-gpu-roadmap-cost-zen/

Mind you, I don't usually link WCCFtech, but this seems legit. Additionally, this article speaks of Radeon Instinct, which is not the gaming GPU, and everybody knows that if AMD tries to pull another Vega for gamers after the silly show called HBM-based gaming SKUs, they have literally lost the plot. The only reason we got Vega in the first place (and the reason it was so badly optimized, and still is) is for AMD to have a bucket to toss the leftovers into that didn't make it into Frontier or MI25 cards, because those do sell for 2-3x the price. This was already clear months ago, but nobody wanted to believe it (blaming the lack of cards on 'miners and HBM supply', when in reality the cards never even got to the marketplace)... And here we are today.
Posted on Reply
#70
T4C Fantasy
CPU & GPU DB Maintainer
Vayra86Yup. Cool story bro. Meanwhile there are even 1440p games where the 1080 Ti falls short, but keep living the dream. And next year there will be lots more.

In other news...
wccftech.com/exclusive-amd-navi-gpu-roadmap-cost-zen/

Mind you, I don't usually link WCCFtech, but this seems legit.
Some of it is, anyway. Vega wasn't built for Apple by any means; all desktop Vegas came before the Radeon Pro Mac editions, and the Frontier Edition was like a desktop/workstation hybrid thing lol.
Navi is not being developed strictly because of the PS5 either; it just so happens to be the next GPU Sony will use, because it has to lol.
Casecuttervideocardz.com/76487/amds-7nm-vega-is-much-smaller-than-14nm-vega

I don't like it when tech sites stretch the truth and add personal opinions to it (talking out of your ass) xD
Posted on Reply
#71
Vayra86
T4C FantasySome of it is, anyway. Vega wasn't built for Apple by any means; all desktop Vegas came before the Radeon Pro Mac editions, and the Frontier Edition was like a desktop/workstation hybrid thing lol.
Navi is not being developed strictly because of the PS5 either; it just so happens to be the next GPU Sony will use, because it has to lol.



I don't like it when tech sites stretch the truth and add personal opinions to it (talking out of your ass) xD
True, but those are all silly and 'flexible' details to me; the gist of the article is the change of focus for RTG/AMD and the fact that the change already started when they struck their console deals. It makes sense, it's 100% plausible, and it shows in AMD's product portfolio, its results, and its roadmap.

This also puts Raja's PR moves in a whole other light: the man was literally just lying to us and he knew damn well he did. 'Poor Volta'... aimed at the gamer crowd, remember.
Posted on Reply
#72
T4C Fantasy
CPU & GPU DB Maintainer
well
Vayra86True, but those are all silly and 'flexible' details to me; the gist of the article is the change of focus for RTG/AMD and the fact that the change already started when they struck their console deals. It makes sense, it's 100% plausible, and it shows in AMD's product portfolio, its results, and its roadmap.

This also puts Raja's PR moves in a whole other light: the man was literally just lying to us and he knew damn well he did. 'Poor Volta'... aimed at the gamer crowd, remember.
AMD said during the Vega 20 announcement that the graphics team focused on the server side of things and left gaming behind, and that's why progress slowed down; I don't think that has anything to do with consoles. They also said during Computex that they are turning it around and will now release GPUs every year.

It's plausible, sure, that the console part did what it did.
Posted on Reply
#73
Vayra86
T4C Fantasywell
AMD said during the Vega 20 announcement that the graphics team focused on the server side of things and left gaming behind, and that's why progress slowed down; I don't think that has anything to do with consoles. They also said during Computex that they are turning it around and will now release GPUs every year.

It's plausible, sure, that the console part did what it did.
It's easy to release a GPU every year. Look how long they dragged Pitcairn along, or how many rebrands we've seen in GCN. But releasing an INTERESTING GPU every year? Never. Gonna. Happen. They've already said themselves they can't push all the buttons at the same time. It would be silly if we still believed they could anyway, despite the current state of affairs in GPU performance.
Posted on Reply
#74
T4C Fantasy
CPU & GPU DB Maintainer
Vayra86It's easy to release a GPU every year. Look how long they dragged Pitcairn along, or how many rebrands we've seen in GCN. But releasing an INTERESTING GPU every year? Never. Gonna. Happen. They've already said themselves they can't push all the buttons at the same time. It would be silly if we still believed they could anyway.
Yes, but they talked about refinements this time, not rebrands. And back then the high end did get a yearly release:
HD 6970 - December 2010
HD 7970 - January 2012 - June 2012 (GHz)
R9 290X - October 2013

After that it slowed down:
R9 FURY X - June 2015
RX Vega 64 - August 2017
Posted on Reply
#75
Vayra86
Yes, I think Hawaii was the wake-up call for AMD, because it showed an architecture that really was no longer up to snuff for gaming purposes (too hot, too hungry). Fury X was the HBM test-case project, a double-edged blade to use in the high-end gaming segment one last time, and Vega represents the completed U-turn to new marketplaces and segments (IGP as well); the 56 and 64 gaming versions of it are just a bonus.

'Refinements'... that's just a rebrand on a smaller node, right? :P
Posted on Reply