
AMD Zen3 to Leverage 7nm+ EUV For 20% Transistor Density Increase

Joined
Jun 3, 2010
Messages
968 (0.26/day)
Intel shut the EUV door on itself so deliberately that it practically masterminded its own disadvantage. Lasertec's EUV mask inspection machines getting booked out for years on end dropped them even further in the node race. Now the rest of the manufacturers have a fast path to both DUV, which spares a portion of mask verification runtime, and EUV, which is even more readily deployed at 5 nm. GJ Intel, you have successfully let yourself become a fast follower.
 

bug

Joined
May 22, 2015
Messages
7,561 (4.10/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10

Ascendor81

New Member
Joined
Apr 16, 2019
Messages
3 (0.01/day)
Nope.


Better value, because performance is hit and miss. I have tried and built several Ryzen rigs by now. Performance is not good across the board, hence the price.
Sorry las, but you are wrong. MSI said the customer service rep was misinformed; the Ryzen 3000 CPUs are under evaluation for 300- and 400-series boards.
 
Joined
Dec 6, 2016
Messages
82 (0.06/day)
System Name The RIG MK III
Processor AMD Ryzen 2600 @ 4.15GHz / 1.41v
Motherboard Gigabyte B450 Aorus M
Cooling Thermalright "Le Grand Macho"
Memory 8GB Crucial Ballistix Tactical 3000MHz CL15-16-16-34 @ 3466MHz CL16-17-16-34
Video Card(s) Asus VEGA 64 Gamer Strix OC
Storage WD Green 240GB | Crucial MX500 512GB M.2 | 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Twin 27" Dell u2713h 2k monitors
Case Chieftec Cube CI-01B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Seasonic Focus+ Gold 750W
Mouse Logitech G600
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Huh, what? It's a fact. Ryzen performance is hit and miss depending on workload, and especially in games (high-fps gaming, that is).

In tons of applications Ryzen sucks too. Handbrake, to mention one.
High-fps gaming is a niche market; it speaks mostly to competitive gamers, and out of all competitive games only shooters really benefit from it. I've never seen a LoL, SCII or WoT player ask for a high refresh rate display, but all are adamant about FreeSync or G-Sync. The vast majority of my clients don't do competitive gaming and want 2K or 4K @ 60 fps. Very few of them are interested in 120 or 140 Hz displays (under 10% of my gamer/enthusiast clients). I myself am more partial to high-resolution 60 fps than to 1080p at 140 Hz. The image quality in newer games (like Anno 1800, for example) is staggering, and 60 fps is fine if frame times are good.
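To put concrete numbers on the fps vs. frame-time point, here's a quick sketch (plain Python, pure arithmetic, nothing assumed beyond the refresh rates mentioned above). The gap per frame shrinks fast as fps climbs, which is part of why consistent frame times matter more than the headline number:

```python
# Frame-time budget at the refresh rates discussed above.
# Going 60 -> 120 fps halves the frame budget; 120 -> 140 saves barely 1 ms.
for fps in (60, 120, 140):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 120 fps -> 8.3 ms per frame
# 140 fps -> 7.1 ms per frame
```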
 
Joined
Jun 10, 2014
Messages
2,040 (0.93/day)
I don't really understand what you mean by "re-spin". DDR5 won't come as a surprise. CPU makers take part in memory development; DDR5 and the appropriate CPUs have been developed together and can be launched together.
Yes, obviously all relevant parties have worked on their prototypes for years, in fact most standards are derived from prototypes, and are certainly not created in a vacuum.

But I don't get why so many people are fixated on DDR5, is there really a rush? CPU vendors will make the switch when it's needed and it's ready, not before. And based on the information I've seen, DDR5 doesn't look so good yet in terms of latency.

For mainstream users, dual-channel DDR4 2666 MHz is plenty, and if you're a content creator you can always go for a quad-channel HEDT configuration. Memory bandwidth is primarily a bottleneck for heavy server loads, which is why the upcoming Ice Lake-SP will feature 8 memory channels per socket. Higher memory bandwidth only really helps if a workload is bottlenecked by it, and over the past decade memory bandwidth has grown much faster than core speed, so generally you need heavy multithreading to be bottlenecked by memory bandwidth. As of now, DDR4 officially tops out at 3200 MHz (JEDEC), and I don't know whether that can be pushed further.
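As a rough way to see whether a desktop workload is anywhere near bandwidth-bound, here's a crude copy-bandwidth sketch (plain Python; a toy for illustration, not a proper benchmark like STREAM — no thread pinning, no NUMA awareness, and the interpreter adds overhead):

```python
# Crude single-threaded memory copy bandwidth estimate.
# bytearray slice assignment is a C-level memcpy, so this mostly measures
# memory traffic rather than Python overhead.
import time

SIZE = 256 * 1024 * 1024          # 256 MiB, far larger than any CPU cache
src = bytearray(SIZE)
dst = bytearray(SIZE)

reps = 5
t0 = time.perf_counter()
for _ in range(reps):
    dst[:] = src                  # C-level memcpy under the hood
elapsed = time.perf_counter() - t0

gb = reps * 2 * SIZE / 1e9        # each copy reads src and writes dst
print(f"~{gb / elapsed:.1f} GB/s effective copy bandwidth")
```

If the number you get is far above what your actual workload moves per second, memory bandwidth is probably not your bottleneck.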

Macs depend on x86 software ported from Windows. This is the reason Apple switched to Intel in the first place. They'll have to make sure every important piece of software is available for ARM. Most isn't.
Well, actually not.
The primary motivation was hardware. PPC promised to replace x86, but failed to do so. Apple needed an architecture capable of scaling from a low-power laptop to a high-end workstation, and only x86 could do that.

In terms of software, OS X has no relation to Windows or its ecosystem at all. The Darwin kernel is a mix of the Mach microkernel and BSD Unix, and it largely relies on the BSD ecosystem, APIs, compilers etc., though Apple is now increasingly drifting away from that and building its own walled garden… Most of Apple's current software is already compatible with ARM, so that's not really a big concern.

But ARM is nowhere close to performant enough to replace x86, and this won't change anytime soon. ARM devices like iPhones and iPads rely heavily on specialized instructions to accelerate workloads, and anything else performs like crap. If Apple switched completely to ARM, it would mean either focusing solely on low-performance "web browsing devices" or a hornet's nest of software patchwork and specialized instructions to stay "competitive".

But on an interesting note regarding AMD: Zen was supposed to be the stepping stone up to K12, the new big CPU architecture from AMD, based on ARM. Meanwhile K12 is MIA, and probably already obsolete compared to other ARM designs. So Zen 2, 3…5, is it the backup plan after the failure of K12? Is this why we don't hear anything about a Zen successor yet?
 
Joined
Jun 3, 2010
Messages
968 (0.26/day)
But I don't get why so many people are fixated on DDR5, is there really a rush? CPU vendors will make the switch when it's needed and it's ready, not before. And based on the information I've seen, DDR5 doesn't look so good yet in terms of latency.
Ryzen 3000 comes with poor memory-cell readdressing options for improving row refresh intervals without incurring DRAM corruption.
 
Joined
Jan 16, 2008
Messages
1,083 (0.24/day)
Location
Milwaukee, Wisconsin, USA
Processor i7-3770K
Motherboard Biostar Hi-Fi Z77
Cooling Swiftech H20 (w/Custom External Rad Enclosure)
Memory 16GB DDR3-2400Mhz
Video Card(s) Alienware GTX 1070
Storage 1TB Samsung 850 EVO
Display(s) 32" LG 1440p
Case Cooler Master 690 (w/Mods)
Audio Device(s) Creative X-Fi Titanium
Power Supply Corsair 750-TX
Mouse Logitech G5
Keyboard G. Skill Mechanical
Software Windows 10 (X64)
Macs depend on x86 software ported from Windows. This is the reason Apple switched to Intel in the first place. They'll have to make sure every important piece of software is available for ARM. Most isn't.

Apple could offer an ARM-powered MacBook next to the x86 one (just like Microsoft does with Surface).
An ARM-exclusive lineup? Think 2025+. But don't bet your house on it.

ARM servers are not compatible with x86 and - assuming ARM will become a more attractive option at some point (likely!) - it'll take a decade before a significant part of the market migrates. And x86 will stay with us for many years.
Both Intel and AMD are thinking about the ARM threat. They'll join if necessary. Don't worry too much. :)
You clearly know nothing about Macs or macOS (previously Mac OS X).
 
Joined
Mar 18, 2008
Messages
5,077 (1.14/day)
Location
Australia
System Name Night Rider | Mini LAN PC | Workhorse
Processor AMD R7 2700X | Ryzen 1600X | i7 970
Motherboard MSi AM4 Pro Carbon | GA- | Gigabyte EX58-UD5
Cooling AMD Wraith cooler | Stock Cooler (Copper Core) | Big Shuriken B
Memory 2x8GB DDR4 G.Skill Ripjaws 3600MHz| 2x8GB Corsair 3000 | 6x2GB DDR3 1300 Corsair
Video Card(s) ASUS GTX 970 OC in Sli | Inno3d GTX 1050 TI | MSI RX 580 8GB
Storage 250GB Plextor SSD Por 5 /1TB WD Black | 500GB SSD WD, 2x1TB, 1x750 | WD 320/Seagate 320
Display(s) LG 27" 1440P| Samsung 20" S20C300L/DELL 15" | 22" DELL/19"DELL
Case LIAN LI PC-18 | Mini ATX Case (custom) | Atrix C4 9001
Audio Device(s) Onboard | Onboard | Onboard
Power Supply Silverstone 750 | Silverstone Mini 450W | Corsair CX-750
Mouse Coolermaster Pro | Rapoo V900 | Gigabyte 6850X
Keyboard MAX Keyboard Nighthawk X8 | Creative Fatal1ty eluminx | Some POS Logitech
Software Windows 7 Pro 64 | Windows 7 Pro 64 | Windows 7 Pro 64/Windows 10 Home
Did some people forget that AMD already beat the 9900K clocked at its MCE settings with a lower-clocked engineering sample? Just thought I'd jog the memory of some Intel peeps on here who keep saying BS on this forum.

Zen 3 might be what I upgrade to next from my 2700X. It all really depends on price/performance, which matters more than just outright performance at, like, double the price, cough cough.
 
Joined
Apr 12, 2013
Messages
3,208 (1.23/day)
But on an interesting note regarding AMD; Zen was supposed to be the steppingstone up to K12, the new big CPU architecture from AMD, based on ARM. Meanwhile K12 is MiA, and probably already obsolete compared to other ARM designs. So Zen 2, 3…5, is it the backup plan after the failure of K12? Is this why we don't hear anything about a Zen successor yet?
Not vanilla ARM cores, but custom Axx cores; this myth that Intel or even AMD are so far ahead of everybody else will be broken soon enough ~

Apple's nearest competitor in this space is Apple itself, and that was last year, so a 2-year gap with a node shrink is definitely good enough IMO to get them into their own notebooks. From there it's only Apple's own ambitions that stop them.
 
Joined
Jun 3, 2010
Messages
968 (0.26/day)
Not vanilla ARM cores, but custom Axx cores; this myth that Intel or even AMD are so far ahead of everybody else will be broken soon enough ~

Apple's nearest competitor in this space is Apple itself, and that was last year, so a 2-year gap with a node shrink is definitely good enough IMO to get them into their own notebooks. From there it's only Apple's own ambitions that stop them.
People ask what there is to love about Qualcomm. In terms of task energy, Apple is their oyster.
 
Joined
Jun 28, 2016
Messages
3,595 (2.50/day)
Most of Apple's current software is already compatible with ARM, so that's not really a big concern.
Oh, that's the misunderstanding. :)
I didn't mean Apple software. They aren't making anything critical at the moment (mostly stuff you need for everyday computing, multimedia, etc.).
I meant 3rd party software, which is only available on x86.
Apple's shift to Intel CPUs made OS X a feasible alternative for Windows and Linux a decade ago.

If you're from the US, you may have different memories; Macs were popular in the US even before that.
Outside of the US they were very rare. Too expensive for home users. Not useful enough for most professional work.

x86 was an architecture people knew how to use. It was supported by important software. It had all the useful libraries.
Moving from Windows/Linux to Mac suddenly was very easy.
I was studying back then and I remember all the Mac Pro and MacBook boxes lying around.
But ARM is not close to performant enough to replace x86, and this wouldn't change anytime soon.
But it can be frugal. That's the use case at the moment. And the cores are simpler, so you can put more of them in a package. In many ways ARM servers will target the same niche AMD does with EPYC. So if you believe EPYC makes sense, then ARM will as well.
Server scenarios built around complex single-threaded tasks are out of ARM's reach, and that's unlikely to change. x86 will always have the edge there.
Is this why we don't hear anything about a Zen successor yet?
Assuming current Zen will be able to compete with Intel for at least 3 years and AMD may keep selling it for another 5… the Zen successor could be a fairly young project.
But who knows? The chiplet idea gives them some interesting possibilities. Can they put an ASIC or FPGA there? Could they open it up? Cooperate with clients who could customize the designs to fit their software?

PS5 is going to support RTRT. We don't know how. We know it's a custom Zen. Wouldn't it be nice if Sony could add a custom chiplet that does ray tracing? What if Nvidia made it?
:)
 
Joined
Oct 25, 2018
Messages
264 (0.45/day)
I don't see why Apple doesn't start using AMD in addition to Intel CPUs to expand their available options. They could use some of the new APUs from AMD. That would cut costs, so they could offer a budget entry-level MacBook at an affordable price, which would expand their user base greatly. Once a person gets used to macOS and knows they can get a sub-$600 laptop, they will most likely stick with it. People don't always like change; they want something that works when they need it to...
With all the Windows 10 hate lately, AMD getting competitive again in the server and consumer markets, plus the decline in their overpriced iPhone market, it would seem like an opportune time to take back some of the consumer desktop/laptop share.

I think that would make more sense than trying to port x86 software to ARM, plus trying to develop a CPU that could come close to Intel's and AMD's current CPUs...

Everybody knows AMD CPUs can run OS X (or macOS); the Hackintosh community has been modding the Mach kernel since the early days of the switch to Intel, 2006-2007ish I believe.

Technically, they could also just sell a version of macOS to run on PCs if they wanted to. Sell it for $150-250 and make a lot of money. I think people would pay for that instead of doing all the tricks to get macOS running on PC hardware...

I think they need to do something other than rely solely on iPhone sales. That market will continue to decline; I remember when you could get a new phone for sub-$400 in the not too distant past. Now? Really, $1300? Forget that...
I'll stick to my $60 Wal-Mart phone that runs Android, which I replace every year and a half...
 

bug

Joined
May 22, 2015
Messages
7,561 (4.10/day)
I don't see why Apple doesn't start using AMD in addition to Intel CPU's to expand their available options.
Apple never sources the same part from multiple suppliers. It's one of the ingredients of the recipe that keeps their costs down and gives them their high margins.
 
Joined
Jun 10, 2014
Messages
2,040 (0.93/day)
Oh, that's the misunderstanding. :)

I didn't mean Apple software. They aren't making anything critical at the moment (mostly stuff you need for everyday computing, multimedia etc).

I meant 3rd party software, which is only available on x86.
Fair enough, but the selection of commonly used productive third-party applications for the Mac is fairly limited, and since Apple seems to be trying to get out of the pro market, the set of "required" applications will probably be very small.

BTW, don't they still own a part of Adobe?

x86 was an architecture people knew how to use. It was supported by important software. It had all the useful libraries.
Sure, but it's worth mentioning that most professional tools available for Macs were already available on PPC, though of course everyone appreciated ditching PPC. :)

But it can be frugal. That's the use case at the moment. And the cores are simpler, so you can put more of them in a package. In many ways ARM servers will target the same niche AMD does with EPYC. So if you believe EPYC makes sense, then ARM will as well.
Well, if you consider that ARM chips in mobile devices, TVs, Blu-ray players etc. are basically very weak CPUs with specialized acceleration bolted on for anything that needs performance, then ARM isn't really that efficient in generic code. Without the accelerators these CPUs would struggle to open a web browser or play a video.
ARM fundamentally requires more operations to do the same work, so it needs a much higher clock speed. Also, your high-TDP desktop CPU spends a lot of its die space on a much bigger prefetcher, more SIMD features etc., things which make a huge difference when using e.g. Photoshop or Premiere.

EPYC certainly makes sense for some workloads; ARM servers don't really.

PS5 is going to support RTRT. We don't know how. We know it's a custom Zen. Wouldn't it be nice if Sony could add a custom chiplet that does ray tracing? What if Nvidia made it?
Coming from PR guys it can mean just about anything, so I wouldn't read too much into it before we have real details. Light, well-crafted use of ray tracing is technically already possible on existing hardware using OpenCL or CUDA, so it all comes down to what they mean by "RTRT"…
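To illustrate that point: the core of ray tracing is just intersection math that any compute device can run; the open question behind "RTRT" is doing enough of it per frame. A toy sketch of the basic primitive (plain Python, purely illustrative, not how a real engine is structured):

```python
# Ray-sphere intersection: the fundamental operation of a ray tracer.
# A real-time renderer runs millions of these per frame on GPU compute
# (OpenCL/CUDA) or dedicated hardware; the math itself is simple.
import math

def hit_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the first hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c          # discriminant of the quadratic
    if disc < 0:
        return None                   # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None       # ignore hits behind the ray origin

# Ray from the origin straight down -z toward a unit sphere at z = -5.
print(hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```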
 
Joined
Mar 23, 2005
Messages
3,692 (0.66/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 1700X @ stock - (Plus ZEN3 7nm+ Prototype)
Motherboard ASRock Fatal1ty X370 GAMING X AM4
Cooling Corsair H115i PRO RGB, 280mm Radiator, Dual 140mm ML Series PWM Fans
Memory G.Skill TridentZ 32GB (2 x 16GB) DDR4 3200
Video Card(s) Sapphire Radeon RX 580 8GB Nitro+ SE + (RDNA2 7nm+ Prototype)
Storage Corsair Force MP500 480GB M.2 (OS) + Force MP510 480GB M.2 (Steam/Games)
Display(s) Asus 27" (MG278Q) 144Hz WQHD 1440p + 1 x Asus 24" (VG245H) FHD 75Hz 1080p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ ASUS Xonar DGX PCI-E GX2.5 Audio Engine Sound Card
Power Supply Corsair TX750W Power Supply
Mouse Razer DeathAdder PC Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G15 Classic Gaming Keyboard
Software Windows 10 Pro - 64-Bit Edition
Benchmark Scores WHO? I'm the Doctor. The Definition of Gaming is PC Gaming...
Zen 3 is looking more and more like an upgrade option.
Looking forward to more information in 2020. :)
 