
AMD Earnings Call: GPU Production is Ramping and Mobile GPUs are Set to Arrive Later This Quarter

AleksandarK

News Editor
Staff member
The current supply of graphics cards has been very tight all over the world. Since the launch of the latest Radeon RX 6000 series of GPUs based on the RDNA 2 architecture, AMD has struggled to supply enough silicon to meet the demand for these new GPUs. We have discussed this many times in the past and have seen that it is a problem spanning everyone involved in getting silicon chips into the hands of consumers. On Tuesday, April 27th, AMD held its Q1 2021 earnings call and webcast, where company executives talked about the company's future, its underlying problems, and ways of addressing them.

Among the many topics covered in the call, AMD's President and CEO, Dr. Lisa Su, talked about GPU supply. According to Dr. Su, the company is ramping production of its Radeon graphics cards, and the mobile Radeon GPU lineup is just around the corner. Here is the full quote from the earnings call.
Dr. Lisa Su—AMD Q1 2021 Earnings Call said:
And we're well-positioned for further growth as we have tripled our commercial notebook design wins with the largest OEMs this year. In graphics, revenue increased by a strong double-digit percentage year over year and sequentially, led by channel sales growth as revenue from our high-end Radeon 6000 GPUs more than doubled from the prior quarter. We introduced our Radeon 6700 XT desktop GPU with leadership 1440p gaming performance in March and are on track for the first notebooks featuring our leading-edge mobile RDNA 2 architecture to launch later this quarter. We expect Radeon 6000 Series GPU sales to grow significantly over the coming quarters as we ramp production.


View at TechPowerUp Main Site
 
Currently the largest impediment for AMD is Apple still sucking up a large portion of 7nm capacity. What Apple is still using 7nm at TSMC for, though, I don't know - probably more worthless mobile junk.
 
Even if Apple switches to 5nm, you're still dealing with the shortage of silicon substrate.
 
From what I recall, AMD will get 2x the 7nm allocation towards the second half of this year? As Apple moves more of their chips to 5nm, it will free up a lot of capacity!
 
Currently the largest impediment for AMD is Apple still sucking up a large portion of 7nm capacity. What Apple is still using 7nm at TSMC for, though, I don't know - probably more worthless mobile junk.

That's really not how it works. Companies book wafer allotments years in advance. TSMC then builds out production according to those needs. Apple can't take allotment from AMD, TSMC is contractually required to give AMD what it paid for. The real problem is the massive spike in demand, not Apple.
 
Unfortunately the ramp is almost horizontal.
 
When will they release cheaper (200-300€) cards?
 
You know what happens after exiting a ramp?

Free fall.
 
I am no longer optimistic. All I see is a lot of waffle.
 
Last year the 5500M was already in limited availability even without a pandemic, so how would the 6700M fare any better?
 
As far as I'm concerned there is no longer a stock problem; most PC component stores have 6700 XTs in stock and in big numbers, but NO ONE WANTS THEM AT $1,000 A PIECE!!!
Most people have already lost interest; GPUs have become an expensive component that makes sense only for professionals and miners.
 
It's very impressive how AMD, despite the massive exodus of veteran members from the Radeon group, managed to leapfrog Nvidia this generation, being competitive in the high end AND, most importantly, being more efficient!!!

Seems like the issue was them not having enough money for R&D, but with the massive success of Ryzen they sorted that out.

Last year the 5500M was already in limited availability even without a pandemic, so how would the 6700M fare any better?
750 Ti from early 2014 is selling for more than its launch MSRP right now :p
 
It's very impressive how AMD, despite the massive exodus of veteran members from the Radeon group, managed to leapfrog Nvidia this generation, being competitive in the high end AND, most importantly, being more efficient!!!

Seems like the issue was them not having enough money for R&D, but with the massive success of Ryzen they sorted that out.


750 Ti from early 2014 is selling for more than its MSRP right now :p
Dunno, I certainly don't enjoy Radeon software, and they're lacking in features compared to Nvidia. I'd jump off the 6800 XT and would've settled even for a 3070 in a heartbeat if I could catch one for a normal price.
 
It's very impressive how AMD, despite the massive exodus of veteran members from the Radeon group, managed to leapfrog Nvidia this generation, being competitive in the high end AND, most importantly, being more efficient!!!

Seems like the issue was them not having enough money for R&D, but with the massive success of Ryzen they sorted that out.


750 Ti from early 2014 is selling for more than its launch MSRP right now :p
They lost the fat; the master of smoke and mirrors left to practice his tricks at Intel, leaving behind a team that may have been forced to create Vega.

AMD still needs a buyout or an influx of a few hundred billion to bring them to complete fruition. They have some great products that are clearly competitive, but they lack the capital to push their tech beyond the market share they have clawed back. It sucks for them, honestly: so many unprofitable years and a lack of resources that a great management team and an awesome core staff have managed to make work, but without the capital to push the pendulum to the other side. Imagine if they had all the 7nm capacity and production room they could handle; they could sink two whole production runs of Intel hardware and one of Nvidia. They could become market dominant for once in the last decade.

If they keep up their hard work for another few years with winning products, they might be close enough.
 
Dunno, I certainly don't enjoy Radeon software, and they're lacking in features compared to Nvidia. I'd jump off the 6800 XT and would've settled even for a 3070 in a heartbeat if I could catch one for a normal price.

I'm not sure lacking in features is the right way to put it; lacking in polish, perhaps. Sure, they don't have a DLSS competitor, but if their alternative turns out better, I'd rather wait. DLSS isn't even in enough games right now to be a factor anyway. I really don't think a proprietary feature in the GPU space that has to be implemented on a per-game basis has historically lasted long term, either.

Architecturally though, AMD has indeed made large strides.

I personally wouldn't touch a 3070 with a ten-foot pole. 8GB of VRAM is not enough. In case anyone is unaware, Windows 10 shows actual VRAM usage (not just allocation) in Task Manager and has done so for a year now.
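If you want to sanity-check that outside of Task Manager, here's a minimal sketch (assuming the nvidia-ml-py / pynvml package is installed) that reads board-wide VRAM use on an NVIDIA card. Note it reports what the driver has handed out for the whole board, so Task Manager's per-process "Dedicated GPU memory" column is still the easier way to see which game is actually eating it.

# Minimal sketch; assumes nvidia-ml-py (pynvml) is installed and an NVIDIA GPU is present.
# Reports board-wide VRAM in use on the first GPU, not per-process usage.
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()                                # start the NVML session
handle = nvmlDeviceGetHandleByIndex(0)    # first GPU in the system
mem = nvmlDeviceGetMemoryInfo(handle)     # total/used/free, in bytes
print(f"VRAM used: {mem.used / 2**20:.0f} MiB of {mem.total / 2**20:.0f} MiB")
nvmlShutdown()                            # clean up the NVML session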
 
Given the way Apple has treated their customer base for the last 25 years, it is hard to believe this company is still in business.

Nobody drops support for existing hardware the way Apple does.

Nobody spends a decade and a half delivering under-powered hardware that is not upgradeable the way Apple does.

Nobody redesigns software with disregard to current user workflow the way Apple does.


Apple - a fashion statement for non-users
 
I'm not sure lacking in features is the right way to put it; lacking in polish, perhaps. Sure, they don't have a DLSS competitor, but if their alternative turns out better, I'd rather wait. DLSS isn't even in enough games right now to be a factor anyway. I really don't think a proprietary feature in the GPU space that has to be implemented on a per-game basis has historically lasted long term, either.

Architecturally though, AMD has indeed made large strides.

I personally wouldn't touch a 3070 with a ten-foot pole. 8GB of VRAM is not enough. In case anyone is unaware, Windows 10 shows actual VRAM usage (not just allocation) in Task Manager and has done so for a year now.
You might be right; polish is the correct word. Raw power is great, but I'm still on the fence about whether to keep it and relax, or still try to catch/trade for a 3080.
 
As far as I'm concerned there is no longer a stock problem; most PC component stores have 6700 XTs in stock and in big numbers, but NO ONE WANTS THEM AT $1,000 A PIECE!!!
Most people have already lost interest; GPUs have become an expensive component that makes sense only for professionals and miners.

The gen is ageing already; it's already past halfway to the point where the next releases start getting press.

This is fast heading toward Turing Super timing: by the time it gets a bit more acceptable, it's already old news. Part of that is due to limited VRAM.

Given the way Apple has treated their customer base for the last 25 years, it is hard to believe this company is still in business.

Nobody drops support for existing hardware the way Apple does.

Nobody spends a decade and a half delivering under-powered hardware that is not upgradeable the way Apple does.

Nobody redesigns software with disregard to current user workflow the way Apple does.


Apple - a fashion statement for non-users
Apple found a way to set itself apart and still apparently does. Software/hardware integration is the key phrase. Using the stuff, I can see both its merits and its drawbacks/limitations. It's good there is actual choice in the matter. I like Windows for its flexibility too. But this is a topic about none of that, right?

Did you mistake the A in AMD for fruit?
 
Took me a minute... at first I thought you were serious :roll:
Sadly I was; it's just that the current situation, and lately AMD's pricing, is the joke.
 
750 Ti from early 2014 is selling for more than its launch MSRP right now :p

Yeah, blame AMD for that lack of competition.

Given the way Apple has treated their customer base for the last 25 years, it is hard to believe this company is still in business.

Nobody drops support for existing hardware the way Apple does.

Nobody spends a decade and a half delivering under-powered hardware that is not upgradeable the way Apple does.

Nobody redesigns software with disregard to current user workflow the way Apple does.


Apple - a fashion statement for non-users

Frank, are you ok?

Sadly I was; it's just that the current situation, and lately AMD's pricing, is the joke.

I think they price it accordingly.

[Image: relative-performance-cpu.png]


It outperforms 8 cores and is close to 10 cores; are you sure you don't want to charge the same $320 for your product as Intel does for the 10700K?
 
It outperforms 8 cores and is close to 10 cores; are you sure you don't want to charge the same $320 for your product as Intel does for the 10700K?

OP mentioned “cards,” so GPUs (not CPUs, which I would also argue are overpriced, but that is using my measuring stick from the “before times”).
 
OP mentioned “cards,” so GPUs (not CPUs, which I would also argue are overpriced, but that is using my measuring stick from the “before times”).

Yes, I saw that. Just wondering where he got the idea of blaming a certain company for such a situation. MSRP has meant nothing since last year. The other keyword is "lately," so I'm expecting it has something to do with a recent hardware launch, in this case Rocket Lake.
 
Dunno, I certainly don't enjoy Radeon software, and they're lacking in features compared to Nvidia. I'd jump off the 6800 XT and would've settled even for a 3070 in a heartbeat if I could catch one for a normal price.
I'd rather have had the sane amount of VRAM on the RX 6800, but I was forced to go with the 3070 because there was no 6800 stock and they were all over 1000 EUR. I don't care about Nvidia's shiny features.
 
The gen is ageing already; it's already past halfway to the point where the next releases start getting press.

This is fast heading toward Turing Super timing: by the time it gets a bit more acceptable, it's already old news. Part of that is due to limited VRAM.


Apple found a way to set itself apart and still apparently does. Software/hardware integration is the key phrase. Using the stuff, I can see both its merits and its drawbacks/limitations. It's good there is actual choice in the matter. I like Windows for its flexibility too. But this is a topic about none of that, right?

Did you mistake the A in AMD for fruit?
No, I was replying to the Apple comments above. Hit the wrong reply button, oops.
 