Tuesday, July 7th 2020
Apple to Develop the Metal Family of GPUs, Dump AMD Radeon
In the next big step toward complete silicon independence, Apple is planning to dump AMD as its supplier of discrete GPUs in the near future, closely following its decision to move away from Intel and the x86 instruction-set architecture in favor of its own Arm-based SoCs. The company is developing its own line of discrete GPUs under the "Metal GPU Family" banner, a name borrowed from its Metal graphics API.
This explosive bit of information comes from a WWDC 2020 presentation slide posted by Longhorn (@never_released) on Twitter. The slide suggests that along with the processor, Apple is making a clean break with its graphics hardware. The SoCs powering client-segment Macs, such as future iMacs or MacBooks, could feature iGPUs based on this graphics architecture, while larger platforms such as MacBook Pros, Mac Pros, and iMac Pros of the future could feature Apple's own discrete GPUs.
Source:
Longhorn (Twitter)
67 Comments on Apple to Develop the Metal Family of GPUs, Dump AMD Radeon
Don't get me wrong, I have seen a few knowledgeable Geek Squad guys before, but I've seen a lot that are so ignorant when it comes to hardware it makes you wonder how they got their job. I was shopping around for a laptop a couple years back for the daughter - something she could use to do some light gaming, school work, and Skype with grandparents and her aunt/uncle. I wasn't looking to spend a lot, just something that would work for her for 4 or 5 years until she got to high school. As I was browsing laptops at Best Buy I overheard a Geek Squad guy talking with an older lady (probably around 60) about the laptops just a few feet away. The lady asked what's the difference between having 2 cores and 4 cores. She saw the 2-core was clocked higher (2.3 GHz or something like that) and the 4-core was only 2.0 GHz. The Geek Squad guy literally told her that 2 cores at 2.3 GHz means she'd have a total speed of 4.6 GHz, whereas the 4-core at 2.0 GHz would mean she'd have 8 GHz, which is why the 4-core was better. I started laughing loudly and they both looked at me. I looked straight at the Geek Squad guy as I told the lady this guy was a moron and she should go elsewhere to find a laptop; I suggested Microcenter if she wanted proper answers to her questions.
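For what it's worth, the "add the clocks together" math that employee used isn't how multi-core performance works: real-world speedup is limited by how much of a workload can actually run in parallel, which is what Amdahl's law captures. A minimal Python sketch (the 80% parallel fraction is an illustrative assumption, not a measured value):

```python
def effective_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup over a single core when only
    parallel_fraction (0.0-1.0) of the workload scales with cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# The flawed "add the clocks" logic claims 4 x 2.0 GHz = "8 GHz".
# Amdahl's law, assuming 80% of the work parallelizes:
four_core = 2.0 * effective_speedup(0.8, 4)  # 2.0 * 2.5  = 5.0 GHz-equivalent
two_core = 2.3 * effective_speedup(0.8, 2)   # 2.3 * 1.67 ~ 3.83 GHz-equivalent

print(round(four_core, 2), round(two_core, 2))  # prints: 5.0 3.83
```

So the 4-core chip can indeed come out ahead, but nowhere near the claimed "8 GHz" - and for a mostly serial workload (parallel fraction near zero) the higher-clocked 2-core part would actually win.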
The wife likes Macs - I guess it's the only thing she's used since she started her graphic design classes in college. She likes the Apple store if she needs to find a new laptop (thankfully that's only every 6-8 years, because these things are outrageously overpriced and she takes care of her laptops) so she can ask questions of people that strictly deal with Apple products. She also likes the fact that the company she works for will give her thousands of dollars of software to install on her laptop so she can use it to work from home if needed, and they don't care if she uses it to freelance.
Meanwhile, the most negative thing about computer hardware, everywhere and in general, is that it's slow and annoying, buggy and disappointing.
The fact is the vast majority of people choose their OS largely based on the applications they use, not based on what OS performs the "best" & this applies especially to Mac owners.
I chose a free & a paid application just to show the fallacy in your argument.
On the other hand, I know of some marketing and design shops that made the jump to Windows systems after the early 2010s. This was back when the “trash can” Mac Pro offered limited expansion / was mostly a downgrade from the 2008-2012 models, and the 17” MacBook Pro was axed around 2011. Apple hasn’t been friendly to the pro market for years, favoring general consumers and those who see a Mac more as a fashion/prestige item. So it wouldn’t surprise me if the pro market has already eroded to a point where Apple is ready to throw in the towel on these users. I’m certain Final Cut, the Adobe products, and other big names will be ported over in some form. But the ecosystem of smaller vendors’ pro products and those selling addons/plugins will likely go away.
Think of it like Valve/Steam vs Windows Store. Valve was "willing" to create SteamOS (Linux distro) to compete against Windows Store. When Microsoft dropped Windows Store requirements, SteamOS died. As such, it became clear (in hindsight) that SteamOS was purely a negotiating tool.
Apple previously could only switch between NVIDIA and AMD. Intel discrete GPUs are around the corner, and adding a 4th option of its own... even if it isn't necessarily going to be used... would definitely lower the prices Apple pays.
Also, this post seems misleading. We don’t really have context for the slide, even from the tweet, and we know Apple is going to continue to build and support Intel Macs for some years ahead.
Couldn’t this mean the second column in the slide indicates that they’re going to continue to use other vendors with their Intel based Macs? How many years will it be until they can beat Navi in their flagship product?
Edit: Also worth noting that Apple has opened up development of system and kernel extensions (drivers) for the first time in this new release. I wonder what this means for add-in cards? Maybe NVIDIA can finally release some drivers :p
That's an expensive naught.
So we're finally going to see a fourth discrete GPU vendor; it'll be interesting, no doubt.