
AMD Reports First Quarter 2019 Financial Results - Gross margin expands to 41%, up 5 percentage points year-over-year

No. IMO Ryzen 3000 will be out on June 7th; this is why all the motherboard makers are teasing X570.

I'm not sure if I'd call that "very high", but certainly it's an advantage of the more modern architecture.
Look at the TH review: EPYC uses 16 memory DIMMs vs 12 on the Xeon 8280:
Now when you look at large-scale data centers, this is big, before even looking at 64C Rome.
http://media.bestofmicro.com/9/V/834979/original/Image2.png
 
No, not to compete with the Xeon G lineup, remember it's EPYC. The question was more like whether they will add a GPU chiplet so the compute power is improved, so CPU and GPU compute units on a single package - and you've got a lot of space on EPYC :)
This is very interesting IMO.
What is the point of such a combo today, when everything is "in the cloud"? Shared GPU-CPU memory? That's too situational.
 
Yeah, they stopped. Only twice as many Ryzens are sold compared to Intel. :D
LOL "won't be much better, up to 15% in best case" while Intel getting CPUs gen by gen by a 5% jump.

From what I read, yes, they aren't flying off the shelves in the consumer segment. The rest of that comment, well... :p

AMD's growth happens in data centers.
 
What is the point of such a combo today, when everything is "in the cloud"? Shared GPU-CPU memory? That's too situational.
Workstations.
No. IMO Ryzen 3000 will be out on June 7th; this is why all the motherboard makers are teasing X570.
We'll see how this goes. Maybe they'll show just a few models.

Looking at just a single CPU, moving the <=8C variants to 7nm makes little sense (slightly lower power consumption, not a big deal if everyone accepts/likes how much Zen+ pulls).

So it's down to the issue of yields and using bad chiplets for low-core SKUs. Does it save them enough money to cover whatever TSMC asks? Only AMD knows.
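For anyone curious how that trade-off looks on paper, here is a minimal sketch of the yield argument; the die sizes and defect densities below are placeholders, not AMD's or TSMC's figures, and wafer price is left out entirely:

```python
# All numbers here are placeholders to illustrate the trade-off, not real AMD/TSMC figures.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude candidate-die estimate, ignoring edge losses and scribe lines."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

def yield_rate(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: smaller dies are hit by fewer defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

chiplet_area = 75    # small 8C chiplet, roughly this size (mm^2, assumed)
mono_area    = 210   # hypothetical monolithic >=8C die on an older node (mm^2, assumed)

for name, area, defect_density in [("7nm chiplet", chiplet_area, 0.3),
                                   ("older-node monolithic", mono_area, 0.2)]:
    good = dies_per_wafer(area) * yield_rate(area, defect_density)
    print(f"{name}: ~{good:.0f} good dies per wafer")

# Small dies lose far fewer candidates to defects, and partially defective chiplets
# can still be harvested for low-core SKUs - that saving is what has to outweigh
# the higher 7nm wafer price.
```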

Given how expensive 7nm is, my guess is that they would make more by keeping the Zen+ lineup for everything that doesn't need chiplets. They could add PCIe 4.0 and call it Zen++.

>8C Ryzen (this year) and >4C APU (next year?) would be Zen 2.
And as 7nm gets cheaper, AMD will start fresh on a new socket (late 2020 or 2021).
Look at the TH review: EPYC uses 16 memory DIMMs vs 12 on the Xeon 8280:
Now when you look at large-scale data centers, this is big, before even looking at 64C Rome.
When you look at the cloud, peak power consumption is not the statistic you should focus on.
Xeon CPUs use more power, but are faster. The key information is how much time and energy are needed to complete a task ("time to solve" and "energy to solve").
There's no clear winner here. In many scenarios EPYC has a 10% advantage. In some, Intel pulls ahead thanks to lower latency and a more robust instruction set.
There are way too many variables to say that one server CPU is better than another in general.
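To make the "time to solve" vs "energy to solve" point concrete, here's a minimal sketch with made-up wattage and runtime figures (illustrative only, not benchmark data):

```python
# Illustrative only: the wattage and runtime figures are made up, not measured data.
def energy_to_solve(avg_power_w, time_to_solve_s):
    """Energy consumed to finish one task, in watt-hours."""
    return avg_power_w * time_to_solve_s / 3600

# Hypothetical workload: CPU A draws less power but runs longer,
# CPU B draws more power but finishes sooner.
cpu_a = energy_to_solve(avg_power_w=225, time_to_solve_s=1000)  # ~62.5 Wh
cpu_b = energy_to_solve(avg_power_w=280, time_to_solve_s=750)   # ~58.3 Wh

print(f"CPU A: {cpu_a:.1f} Wh, CPU B: {cpu_b:.1f} Wh")
# The higher peak-power chip can use *less* total energy if it finishes fast enough.
```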

Also, I think we focus on power draw way too much. Over 3 years a typical server will consume energy worth about 10% of the hardware cost.

Remember that some server-specific software is licensed per core (databases being the most significant). We don't think about it often, but the simple fact is: if you need a 32C EPYC to match a 28C Xeon, the license cost stemming from 4 extra cores may eat all the savings an EPYC provides.
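A rough sketch of that licensing math; the per-core fee and the hardware saving below are placeholder numbers, real fees vary wildly by vendor:

```python
# Placeholder numbers purely to illustrate the per-core licensing argument.
license_per_core = 1750            # hypothetical yearly per-core fee, USD
epyc_cores, xeon_cores = 32, 28

extra_license_cost = (epyc_cores - xeon_cores) * license_per_core
print(f"Extra yearly license cost for 4 more cores: ${extra_license_cost}")

# If the EPYC system were, say, $7000 cheaper up front, those 4 extra licensed
# cores ($7000/year with this fee) would erase the hardware saving within a year.
```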
 
So, if we take into account that Rome is supposed to be released first, does that mean there will be no Zen 2 consumer-grade CPUs until at least Q4?

This is what I want to know. Surely Rome will come first, any time from July to September (Q3). So at best Ryzen 3000 is coming in July as well. Or is there a precedent for desktop releasing first with AMD?
 
This is what I want to know. Surely Rome will come first, any time from July to September (Q3). So at best Ryzen 3000 is coming in July as well. Or is there a precedent for desktop releasing first with AMD?
Why are you so sure Rome will be launched first?
 
People have stopped buying Ryzens, waiting for the next gen on 7nm.
Unfortunately, it won't be much better, up to 15% in the best case
15% in IPC, much more in total. Many (reliable) rumors indicate the mid-range 7nm Ryzen (so Ryzen 5) beating the high-end 9900K.
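The "15% IPC but much more in total" claim follows from single-thread performance scaling roughly as IPC times clock; a quick sketch with a hypothetical clock gain:

```python
# Rough model: single-thread performance ~ IPC x clock.
# The clock gain below is an assumption, just to show how the two multiply.
ipc_gain   = 0.15   # the rumored ~15% IPC uplift
clock_gain = 0.08   # assumed extra clock headroom from 7nm (illustrative)

total_gain = (1 + ipc_gain) * (1 + clock_gain) - 1
print(f"Combined single-thread gain: {total_gain:.1%}")   # ~24.2%
# Plus whatever extra cores 7nm allows, which is where "much more in total" comes from.
```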
 
AMD's entire quarterly profit is just a couple times more than Lisa Su's total compensation for that period. :laugh:
 
AMD's entire quarterly profit is just a couple times more than Lisa Su's total compensation for the year. :laugh:
I fixed that for you. Her total compensation for a year is more like $10.8M. Quarterly profit was $38M, so my correction to your statement is a tad more accurate, probably by a factor of 4, because you're comparing quarterly profits to yearly compensation.
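For the record, the back-of-the-envelope version, using only the figures quoted above (not re-verified here):

```python
# Using the figures quoted in the post above (not re-verified).
yearly_compensation = 10.8e6    # Lisa Su's stated yearly total compensation, USD
quarterly_profit    = 38e6      # quarterly profit as cited in the post, USD

quarterly_compensation = yearly_compensation / 4
print(f"Profit / yearly comp:    {quarterly_profit / yearly_compensation:.1f}x")    # ~3.5x
print(f"Profit / quarterly comp: {quarterly_profit / quarterly_compensation:.1f}x") # ~14.1x
# Comparing quarterly profit to *yearly* pay understates the gap by roughly 4x,
# which is the correction being made above.
```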
 
More fuel for the idea that the "Next-Gen" GPU architecture could see the light in the first half of 2020, and will be an enthusiast offering just using GDDR6. The diverging professional segment looks to be holding with Vega/Ellesmere for a while longer, and being the flag-bearer for HBM.

It's been rocky for HBM; it wasn't completely necessary for consumer products/markets when first envisioned. Though with GDDR5 going on longer than anticipated it made some sense, knowing a GDDR6 spec was only somewhat "agreed" at that point (2014-15). I see that AMD/RTG wanted to keep HBM development going, it being big for emerging AI and data centers; selling the idea as "gaming" helped internally to keep the engineering momentum/funds flowing, all while graphics budgets were scrutinized and cut.
 
HBM isn't really the problem; there were low-cost variants that were supposed to come out, and I guess that never happened in a big way? Also, HBM demand is soaring from all corners, so it's unlikely to be cheap anytime soon. Lastly, the interposer: until we move to 3D packaging, HBM + interposer will be a costly, hit-or-miss endeavor.
 
It's been rocky for HBM; it wasn't completely necessary for consumer products/markets when first envisioned. Though with GDDR5 going on longer than anticipated it made some sense, knowing a GDDR6 spec was only somewhat "agreed" at that point (2014-15). I see that AMD/RTG wanted to keep HBM development going, it being big for emerging AI and data centers; selling the idea as "gaming" helped internally to keep the engineering momentum/funds flowing, all while graphics budgets were scrutinized and cut.
It has a huge amount of potential where physical space is at a premium. I would love to see APUs with HBM used for both video and system memory. It would eliminate the need for off-chip DRAM. If you think about it, most ultrabooks and a lot of laptops now have memory soldered to the motherboard; something like this could save even more space that could be used for battery instead. Even with my Vega 64, I'm astonished at how bare the card itself is when it's not loaded up with DRAM chips. The problem is that HBM is very costly to implement. You really need to gain something significant by using it, otherwise it just increases costs.
 
You really need to gain something significant by using it, otherwise it just increases costs.
Oh, I'm not saying it's only (ever) for professional GPU segments, or that we'll never see it return to the consumer space, but as you say, you need to see significant gains vs. cost. Right now, gaming is not seeing gains that are easy to market or justify (it turned out to be a hard sell). That said, a Mini-ITX card (aka the Nano) can be one of those places, but still a very limited market to justify it. As you say, custom designs and even consumer APUs could be a place where HBM sees vast new territory to explore. So no, we can't wholesale write off HBM; it may just find new uses if 5K, HDR, DXR or later path tracing justify it in the graphics space.

Perhaps when a new Radeon Pro series graphics card arrives on a new architecture, and they can offer it to both markets (as the Vega 7 or Titan V do), we will again see a true AMD "halo" product for consumers.
 
How is not being able to separately upgrade expensive CPU and GPU a feature?
If you need GPGPU, you'll obviously buy a workstation with dedicated graphics cards.

Xeons with integrated GPUs are for workstations that don't utilize GPGPU computing. It's only there for video output.
Apart from saving money, it makes it possible to do this:
[Image: HP Z2 Mini G4 mounted behind a monitor (hp-z2-mini-g4vertical-monitor.jpg)]
 
Hoping this isn't a case of Introlling.
A potential 15% increase is still MUCH better than what Intel was dishing out with each chip refresh, while requiring a new board for it too.
Sometimes even a 5% increase wasn't truly realized - even a 7% increase beats what Intel had been doing.
From the conference call: Navi in Q3, positioned below the Radeon VII from a pricing standpoint; EPYC Rome in Q3 (shipping in Q2); and AMD is even thinking about an EPYC APU in the future(?)
On Navi ray tracing, Lisa didn't want to say; more info only near launch.
Jeez... just shows how badly RTG is doing right now. A new arch, or finally fixing GCN, is very much needed.

Some insight into AMD's market strategy. End users (not businesses) are last on the priority list. Navi is a mid-range product.


Here's more info
 