Monday, September 3rd 2018

AMD Fast-tracks 7nm "Navi" GPU to Late-2018 Alongside "Zen 2" CPU

AMD is unique in the world of computing as the only company with both high-performance CPU and GPU products. For the past several years we have been executing our multi-generational leadership product and architectural roadmap. Just in the last 18 months, we successfully introduced and ramped our strongest set of products in more than a decade and our business has grown dramatically as we gained market share across the PC, gaming and datacenter markets.

The industry is at a significant inflection point as the pace of Moore's Law slows while the demand for computing and graphics performance continues to grow. This trend is fueling significant shifts throughout the industry and creating new opportunities for companies that can successfully bring together architectural, packaging, system and software innovations with leading-edge process technologies. That is why at AMD we have invested heavily in our architecture and product roadmaps, while also making the strategic decision to bet big on the 7nm process node. While it is still too early to provide more details on the architectural and product advances we have in store with our next wave of products, it is the right time to provide more detail on the flexible foundry sourcing strategy we put in place several years ago.

AMD's next major milestone is the introduction of our upcoming 7nm product portfolio, including the initial products with our second generation "Zen 2" CPU core and our new "Navi" GPU architecture. We have already taped out multiple 7nm products at TSMC, including our first 7nm GPU planned to launch later this year and our first 7nm server CPU that we plan to launch in 2019. Our work with TSMC on their 7nm node has gone very well and we have seen excellent results from early silicon. To streamline our development and align our investments closely with each of our foundry partners' investments, today we are announcing we intend to focus the breadth of our 7nm product portfolio on TSMC's industry-leading 7nm process. We also continue to have a broad partnership with GLOBALFOUNDRIES spanning multiple process nodes and technologies. We will leverage the additional investments GLOBALFOUNDRIES is making in their robust 14nm and 12nm technologies at their New York fab to support the ongoing ramp of our AMD Ryzen, AMD Radeon and AMD EPYC processors. We do not expect any changes to our product roadmaps as a result of these changes.

We are proud of the long-standing and successful relationships we have built with our multiple foundry partners, and we will continue to strengthen these relationships to enable the manufacturing capacity required to support our product roadmaps. I look forward to providing more details on those innovations as we prepare to introduce the industry's first 7nm GPU later this year and our first 7nm CPUs next year.
Source: AMD Investor Relations

97 Comments on AMD Fast-tracks 7nm "Navi" GPU to Late-2018 Alongside "Zen 2" CPU

#76
DeathtoGnomes
londiste: It does. For production stuff I would assume the API they use is OptiX, but the point and the underlying technology are the same.


No, we won't. AMD will have their own API that DXR will use, and that'll work. When game devs have implemented straight-up GameWorks stuff, then all bets are off, of course. But even then AMD will help them out in that regard :)
AMD doesn't have their own API, as you put it; they went open source, Google the Vulkan API. They are not like Nvidia, hiding behind vanity and greed with their closed API. Developers can change the Vulkan API to suit their needs.
#77
TheoneandonlyMrK
notb: No problem with that. I'm not changing my opinions. I've criticized Vega for HBM2 dependency and I feel comfortable criticizing this whole 7nm nonsense.
You know how this ends, right? Nvidia will give us consumer grade 7nm GPUs before AMD (I mean available products, not fairy tales).

Actually, there's still a decent probability that Intel delivers their 10nm whatever-lake lineup before 7nm Zen2 and that would be really funny. ;-)
Susan, your criticism is irrelevant; you know only what you read. I'm very happy with my waterblocked Vega, which I actually use.
Stop trolling and read up: Nvidia uses a special proprietary 12nm node for Turing. Do you think that design can just be popped over to the 7nm machine?

Don't change your opinion, who the F cares.
#78
notb
DeathtoGnomes: AMD doesn't have their own API, as you put it; they went open source, Google the Vulkan API. They are not like Nvidia, hiding behind vanity and greed with their closed API. Developers can change the Vulkan API to suit their needs.
What is Nvidia hiding behind? Man... you're really lost in this, aren't you? :-)
Vulkan is a gaming API and an open-source competitor to a proprietary DirectX... by Microsoft.
You do know that Nvidia is also a member of Khronos Group, right? :-D

OptiX, mentioned by @londiste, is a general-purpose ray tracing API.

The only gaming-related commercial Nvidia software I can think of is GameWorks. While it's not free, it has been open-source since 2016. AMD's rival software is GPUOpen (free).
#79
londiste
DeathtoGnomes: AMD doesn't have their own API, as you put it; they went open source, Google the Vulkan API. They are not like Nvidia, hiding behind vanity and greed with their closed API. Developers can change the Vulkan API to suit their needs.
AMD's ProRender is a direct competitor to Nvidia's OptiX.
Vulkan is a (gaming-oriented) API. How ray tracing will be implemented there is a bit of an open question at this point; probably (vendor-specific) extensions.
DXR is part of the DX12 API; RTX is Nvidia's implementation of it.
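To show what "vendor-specific extensions" would mean in practice, here is a minimal C++ sketch (not any vendor's actual code, and the extension name is only an example) of how an application could probe a Vulkan device for a ray tracing extension before relying on it:

```cpp
// Minimal sketch: check whether a Vulkan physical device advertises a given
// extension. "VK_NV_ray_tracing" below is only illustrative of the kind of
// vendor extension being discussed; the actual AMD/Khronos names may differ.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool hasDeviceExtension(VkPhysicalDevice device, const char* wanted) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> props(count);
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, props.data());
    for (const auto& p : props)
        if (std::strcmp(p.extensionName, wanted) == 0)
            return true;   // the driver exposes this extension
    return false;          // fall back to a non-RT code path
}

// Usage (hypothetical extension name):
//   bool rt = hasDeviceExtension(physicalDevice, "VK_NV_ray_tracing");
```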
#80
notb
theoneandonlymrk: you know only what you read
As opposed to you believing in what you feel?
In other words: I'm an extreme positivist and you're a religious zealot?
I can live with that! :-D
I'm very happy with my waterblocked Vega, which I actually use.
You, being very happy with your Vega, really fits well into the whole picture.
Nvidia uses a special proprietary 12nm node for Turing. Do you think that design can just be popped over to the 7nm machine?
OMG! Now it's a proprietary 12nm node! And is AMD using an open-source node?
I can already imagine this TSMC plant with a thick line on the floor dividing it into proprietary and open-source parts! And every time a worker crosses it into the "proprietary" part, a fairy loses its wings...
#81
ShurikN
Regarding 12nm, TSMC stated that there should be zero performance/clock uplift compared to 16nm. It's mostly power consumption and, of course, smaller dies.
#82
TheoneandonlyMrK
notb: As opposed to you believing in what you feel?
In other words: I'm an extreme positivist and you're a religious zealot?
I can live with that! :-D

You, being very happy with your Vega, really fits well into the whole picture.

OMG! Now it's a proprietary 12nm node! And is AMD using an open-source node?
I can already imagine this TSMC plant with a thick line on the floor dividing it into proprietary and open-source parts! And every time a worker crosses it into the "proprietary" part, a fairy loses its wings...
Not at all what I feel; it's what I know and have experience in. I've owned and gamed on a Vega 64, RX 580, GTX 1060 and GTX 1080 Ti, and you are positive about Nvidia only, hint hint (borderline worship) ;)

I presently own four Nvidia cards and seven AMD cards, not just one card; not biased ownership ;).

And Nvidia worked with TSMC for that exclusive proprietary 12nm node; it is just for Nvidia in consumer GPU land, by legal privilege.
#83
medi01
notb: You do know that Nvidia is also a member of Khronos Group, right? :-D
You know how OpenGL lost to DirectX, and I mean the role Nvidia's greed played in it in particular, right?
#85
notb
theoneandonlymrk: Not at all what I feel; it's what I know and have experience in. I've owned and gamed on a Vega 64, RX 580, GTX 1060 and GTX 1080 Ti, and you are positive about Nvidia only, hint hint (borderline worship) ;)
Yes. I prefer Nvidia and I buy Nvidia. What's wrong with that?
I presently own four Nvidia cards and seven AMD cards, not just one card; not biased ownership ;).
So one has to own 11 GPUs to be able to state his preference? You didn't believe in reviews and benchmarks? Can't read? Had to try them all yourself? :-D
And Nvidia worked with TSMC for that exclusive proprietary 12nm node; it is just for Nvidia in consumer GPU land, by legal privilege.
Again: OMG. A company makes a product and they don't want to share their technology and production lines! That's totally unheard of!
Man, you are so fun!
medi01: You know how OpenGL lost to DirectX, and I mean the role Nvidia's greed played in it in particular, right?
I think you're talking about a different event than the one I remember. ;-)
OpenGL lost in the late 90s because it was promoted by a weak company struggling to find a place in a quickly changing market (they went bankrupt a few years later).
DirectX, on the other hand, was supported by a much larger and growing company, which was driving the actual change happening.

It wasn't DirectX that won. It was Windows. :-)

I have no knowledge about Nvidia's impact on those events.
Riva TNT was great, but I don't think they had the position required to impose anything. ;-)
And I had a Riva TNT Vanta, so I hope @theoneandonlymrk will let me have an opinion! :-D
#86
TheoneandonlyMrK
notb: Yes. I prefer Nvidia and I buy Nvidia. What's wrong with that?

Bias, heard of it?

So one has to own 11 GPUs to be able to state his preference? You didn't believe in reviews and benchmarks? Can't read? Had to try them all yourself? :-D

Mining; I did this SO I could get my hands on tech I would not otherwise own.
Trying before complaining makes sense to me



Again: OMG. A company makes a product and they don't want to share their technology and production lines! That's totally unheard of!
Man, you are so fun!

Just pointing out facts to counter you saying nodes are universal; they are not.

I think you're talking about a different event than the one I remember. ;-)
OpenGL lost in the late 90s because it was promoted by a weak company struggling to find a place in a quickly changing market (they went bankrupt a few years later).
DirectX, on the other hand, was supported by a much larger and growing company, which was driving the actual change happening.

It wasn't DirectX that won. It was Windows. :)

I have no knowledge about Nvidia's impact on those events.
Riva TNT was great, but I don't think they had the position required to impose anything. ;-)
And I had a Riva TNT Vanta, so I hope @theoneandonlymrk will let me have an opinion! :-D
You have no knowledge but comment anyway

No, keep your opinion to your biased self.

See the other bolded parts, Susan; you get the idea.
#87
notb
theoneandonlymrk: Mining; I did this SO I could get my hands on tech I would not otherwise own.
OK. So you're a GPU collector! You wanted to have many cards and you succeeded. Fine.
To be honest, I think there are countless more interesting and productive ways of getting money for PC parts, but again - it's your choice. You decided to try mining.

But the connection between this and having knowledge about them is still a mystery to me.

A few comments ago you said that I "know only what I read". So my knowledge comes from reading and yours comes from owning GPUs? :)

You're shifting between trying to intimidate me because I have a cheap GPU and trying to offend me because I read.
I don't think your strategy is a very good one...
Trying before complaining makes sense to me
Well, it doesn't make sense to me. What about hemorrhoids?

I seldom complain about my computers, because I spend a lot of time choosing parts. And I buy those that match my needs.
Honestly, computers are just way too interesting to waste time on trying GPUs. It's just a magic box that makes the monitor shine. Why do you care so much?
Just pointing out facts to counter you saying nodes are universal; they are not.
Could you please point to the comment when I said that?
You have no knowledge but comment anyway
LOL. Is this a challenge? You can test my knowledge if you want.
At least I know something about GPUs that's older than my GPU. :)
#88
TheoneandonlyMrK
notb: OK. So you're a GPU collector! You wanted to have many cards and you succeeded. Fine.
To be honest, I think there are countless more interesting and productive ways of getting money for PC parts, but again - it's your choice. You decided to try mining.

But the connection between this and having knowledge about them is still a mystery to me.

A few comments ago you said that I "know only what I read". So my knowledge comes from reading and yours comes from owning GPUs? :)

You're shifting between trying to intimidate me because I have a cheap GPU and trying to offend me because I read.
I don't think your strategy is a very good one...

Well, it doesn't make sense to me. What about hemorrhoids?

I seldom complain about my computers, because I spend a lot of time choosing parts. And I buy those that match my needs.
Honestly, computers are just way too interesting to waste time on trying GPUs. It's just a magic box that makes the monitor shine. Why do you care so much?

Could you please point to the comment when I said that?

LOL. Is this a challenge? You can test my knowledge if you want.
At least I know something about GPUs that's older than my GPU. :)
Susan, I can tell from the essays you write that you're on a keyboard; I'm on my phone.
So, in short, stop trying to twist what I'm saying and misquoting me, and I'll do you the same honour.
As for me not reading and you only reading, clearly I was indeed yanking your chain, but you're definitely biased and hence quite wrong about much, including where my knowledge comes from.
I enjoyed living in the home of computation through its birth and consumerisation; I got involved.
And I am not biased, unlike you.
#89
RealNeil
AlwaysHope: Dr Lisa Su knows what she is doing!
She's been good for AMD so far.
#90
medi01
Vayra86: Polaris was an attempt to attack the midrange, but guess what, it can't even cover that anymore these days.
Because.... those new 2xxx cards at laughable prices are "midrange" somehow.
notb: OpenGL lost in the late 90s because it was promoted by a weak company struggling to find a place in a quickly changing market (they went bankrupt a few years later).
That poisongreen world of yours...

The turning point was 2001, when Microsoft came out with DirectX 8.
OpenGL lost when nGreedia took the hopeless (but greedy; at least they are consistent in their greediness, one can give them that) "my own extension, incompatible with the competitor's" way of addressing shaders, unlike in DirectX.
notb: ...I have no knowledge about...
It's obvious.
#91
x86overclock
AlwaysHope"AMD is unique in the world of computing as the only company with both high-performance CPU and GPU products."

Dr Lisa Su knows what she is doing!
Navi was mentioned at the top, but you're right, it doesn't say Navi is coming in December 2018.
#92
londiste
medi01: The turning point was 2001, when Microsoft came out with DirectX 8. OpenGL lost when nGreedia took the hopeless (but greedy; at least they are consistent in their greediness, one can give them that) "my own extension, incompatible with the competitor's" way of addressing shaders, unlike in DirectX.
That's a very narrow approach you have there.

SGI had no interest in getting OpenGL into games or consumer 3D acceleration at first. It woke up when 3D accelerator cards started to boom. At that point, Glide ruled the roost and control had to be wrangled away from 3dfx. SGI still did not care, but Microsoft started to - D3D5 and later D3D7 were what did it. OpenGL remained in the competition mainly because both id and Epic wanted to support cross-platform and were a bit idealistic (and their engines dominated the market).

Extensions dealing with shaders (for ATI as well) were due to the simple fact that OpenGL was stagnating. Shaders only really became a thing with OpenGL 2.0, around the end of 2004. The stagnation of OpenGL was the primary reason for transferring control over OpenGL to the Khronos Group in 2006.
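To make the extension-versus-core point concrete, here is a minimal C++ sketch (illustrative only, assuming a working OpenGL 2.0 context with entry points already loaded, e.g. via GLEW) of the core GLSL path that only arrived with 2.0; before that, programmable shading meant querying vendor or ARB extensions such as GL_ARB_fragment_program:

```cpp
// Minimal sketch, assuming an OpenGL 2.0 context and resolved function pointers
// (for example through GLEW). Before GL 2.0 there was no glCreateShader/glCompileShader;
// you checked for extensions like GL_ARB_fragment_program or NV_fragment_program instead.
#include <GL/glew.h>

GLuint compileFragmentShader(const char* source) {
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);   // core entry point since GL 2.0
    glShaderSource(shader, 1, &source, nullptr);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {        // compile failed; the caller can fetch the info log
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```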
#93
Berfs1
ShurikN: Yeah and? Where does it mention that that particular GPU, releasing later this year, will be Navi?
I’m not gonna read the whole article for you, go to paragraph three and stop annoying everyone, you blind illiterate.
#94
ShurikN
Berfs1: I'm not gonna read the whole article for you, go to paragraph three and stop annoying everyone, you blind illiterate.
And where does it say in the third paragraph that Navi is coming in 2018? Since I'm illiterate, and you can read, find it for me please.
#95
Berfs1
ShurikN: And where does it say in the third paragraph that Navi is coming in 2018? Since I'm illiterate, and you can read, find it for me please.
It says the information throughout the entire article, if you weren’t so impatient. I digress, because I have stated my point multiple times, with clear evidence.
#96
x86overclock
robot zombie: Agh, we only wish we could all get our hands on some 7nm AMD GPUs right now. If only they could shake up the GPU market like they have the CPU world. The timing for something like that to happen could not be better, for us and for them. Could you imagine?

I don't know why, but it just dawned on me that AMD's got me good at this point. If their 7nm CPUs make it to AM4 and my X370 will take it, I know already that I'll buy one, even if I don't need it. At that point, they'll have sold me one CPU from each Zen generation. And the only reason I would think to even do that is because of how affordable their CPUs generally are and just the simple fact that you don't have to change boards each gen. Much lower barrier to entry makes one less likely to stop and think... ...just spend the little chunk of cash, drop it in, and enjoy the gains. It's almost too easy to upgrade. It's brilliant. The reality is that this way, I actually spend more with them than I would have been able to justify with Intel.
It's true you won't need to change your board to upgrade to the latest AMD CPU, but by keeping an older motherboard such as the X370 you are missing out on the newer features that increase the CPU's performance, like XFR 2.0, which was designed for the Ryzen 2000 series. I have the Threadripper 1950X on an X399 motherboard and it is a great processor, but the 2950X is much faster in IPC and won't show its true potential until the X499 motherboard chipset is released.
#97
nemesis.ie
Berfs1: It says the information throughout the entire article, if you weren't so impatient. I digress, because I have stated my point multiple times, with clear evidence.
Except it doesn't actually say Navi is coming in 2018.