Apple's Homegrown Chips Could Be the End for AMD Graphics in Macs

Apple CEO Tim Cook on stage at last year's Worldwide Developers Conference.
Photo: Brittany Hosea-Small/AFP (Getty Images)

Apple's transition to its own Mac silicon has raised a slew of questions, and now the future of third-party graphics card support in Macs seems to be in doubt.

According to AppleInsider, the company made it clear during a WWDC 2020 developer session (and in a developer support document) that its home-brewed CPUs will support its own GPUs, too. Apple already uses its own GPUs in its other devices with ARM-based processors, like iPads and iPhones. If Apple is parting ways with Intel, who's to say it won't also part ways with AMD, which makes the GPUs that power current Macs? There's no indication one way or the other, and Apple is staying silent on the matter, so all that's left to do is speculate.

“Apple Silicon Mac contains an Apple-designed GPU, whereas Intel-based Macs contain GPUs from Intel, AMD and Nvidia,” Gokhan Avkarogullari, Apple’s director of GPU software, said during the WWDC session.

That's not surprising. Apple confirmed during its WWDC keynote that the company is making its long-rumored move away from Intel processors to its own system-on-chip (SoC) with Apple Silicon, which includes a move to its own integrated GPUs. What's unclear is what that means for future discrete GPU support. Apple officially stopped supporting Nvidia GPUs when it released macOS Mojave in 2018, but it has continued to offer a range of Macs with AMD graphics.

In the near future, AMD discrete GPUs aren't going anywhere. Apple recently added a new GPU configuration option to its Mac Pro desktop tower, AMD's Radeon Pro W5500X, and the company said during WWDC that it still has new Intel-based configurations on the way. But one of the things the company pointed out during its developer session is the difference between Apple GPUs and third-party GPUs: Apple's GPU architecture is a tile-based deferred renderer (TBDR), while Intel, Nvidia, and AMD GPUs are immediate mode renderers (IMR).

A TBDR GPU captures the entire scene before it starts rendering, splitting it into many small regions, or tiles, that get processed separately in fast on-chip memory, so it doesn't need a lot of external memory bandwidth. Crucially, the architecture won't actually shade the scene until it has rejected any and all occluded (hidden) pixels, so each visible pixel only gets shaded once.

On the other hand, an IMR GPU does things the opposite way, shading triangles as they arrive and only then deciding which pixels need to be thrown out, so some of that work gets wasted on pixels that end up covered (a problem known as overdraw). As you probably guessed, this method is less efficient, yet it's how modern discrete GPUs operate, and they lean on lots of memory bandwidth to make up for it.
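To make the distinction concrete, here's a toy sketch in Swift of how much shading work each approach does. The scene setup, the `Triangle` type, and both functions are made-up illustrations, not anything from Apple's actual drivers or hardware:

```swift
// Toy model of IMR vs. TBDR shading cost. Everything here is hypothetical.
struct Triangle {
    let depth: Float   // smaller = closer to the camera
    let pixels: [Int]  // indices of the framebuffer pixels this triangle covers
}

// Immediate mode (IMR): shade each triangle's pixels as it arrives, letting
// closer triangles that show up later overwrite earlier work ("overdraw").
func shadedPixelsImmediate(_ scene: [Triangle]) -> Int {
    var depth = [Float](repeating: .infinity, count: 64)  // one tiny 8x8 framebuffer
    var shadingOps = 0
    for tri in scene {
        for p in tri.pixels where tri.depth < depth[p] {
            depth[p] = tri.depth
            shadingOps += 1  // work spent now, possibly thrown away later
        }
    }
    return shadingOps
}

// Tile-based deferred (TBDR): bin the whole scene first, resolve visibility,
// then shade each visible pixel exactly once. Real hardware does this per
// small tile in on-chip memory; a single tile stands in for the screen here.
func shadedPixelsDeferred(_ scene: [Triangle]) -> Int {
    var depth = [Float](repeating: .infinity, count: 64)
    for tri in scene {                                     // pass 1: depth only
        for p in tri.pixels { depth[p] = min(depth[p], tri.depth) }
    }
    return depth.filter { $0 < .infinity }.count           // pass 2: shade once each
}
```

Feed both functions the same overlapping geometry and the deferred version always performs the same number of shading operations or fewer; that gap is the overdraw a discrete GPU papers over with raw bandwidth.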

For Apple Silicon's ARM architecture, TBDR is a much better match because the focus is on speed and lower power consumption, and because the GPU sits on the same chip as the CPU and shares its memory (hence the term SoC). This is probably why Apple wrote, "Don't assume a discrete GPU means better performance," in its developer support document. It's all that dang bandwidth it doesn't need.
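You can see that shared-memory design directly in how Metal buffers get allocated. Here's a minimal sketch (assuming a standard Metal setup on a Mac; the vertex data is just filler):

```swift
import Metal

// Grab the system's default GPU and some throwaway vertex data.
let device = MTLCreateSystemDefaultDevice()!
let vertices: [Float] = [0, 1, -1, -1, 1, -1]
let length = vertices.count * MemoryLayout<Float>.stride

// On an Apple-designed GPU, the CPU and GPU share one pool of memory, so a
// .storageModeShared buffer is visible to both sides with no copy step.
let shared = device.makeBuffer(bytes: vertices, length: length,
                               options: .storageModeShared)!

// With a discrete GPU on an Intel Mac, .storageModeManaged keeps two copies
// (system RAM and VRAM) that Metal has to synchronize over the PCIe bus.
let managed = device.makeBuffer(bytes: vertices, length: length,
                                options: .storageModeManaged)!
managed.contents().storeBytes(of: Float(2), toByteOffset: 0, as: Float.self)
managed.didModifyRange(0..<MemoryLayout<Float>.stride)  // flag the edit for re-upload
```

On the unified-memory side there's nothing to synchronize at all, which is exactly the bandwidth-saving point Apple was making.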

It could also be a reason why the Shadow of the Tomb Raider demo (running on Rosetta 2) Apple showed off during its keynote looked so good. I'm no game designer, but if Apple is helping developers port their games not only to its ARM architecture but to its GPU architecture, it just might grow some more teeth in the gaming sphere. And if that happens, Macs might actually become competitive gaming machines once you start comparing benchmarks.

I'd still be highly skeptical of the cost of Apple's future machines, though, especially since you can currently build or buy a PC with better specs for much less than a Mac. There's also something to be said about the DIY culture baked into the Windows-based PC market. Apple has generally made its customers rely entirely on the company for hardware repairs and upgrades, and if it wants to attract more developers to bring their games to its hardware and macOS, understanding PC gaming culture would go a long way. For some, it might not matter if Apple's GPUs are technically better.

Like Intel, AMD will stick it out with Apple for as long as it can, until Apple is positive it can survive without any third-party hardware components. Then the walled garden will be fully grown.

We reached out to Apple for comment on its future AMD GPU plans, but have yet to receive a response. We will update if/when we hear back.
