Everything you need to know about Apple’s AI chip

The AI your phone craves.
Image: Apple/screenshot

Artificial intelligence is becoming a defining feature of the smartphone market, powering personalization and virtual assistants, and even helping to extend battery life.

But AI takes a lot of computing power. To make up for that, companies like Apple and Huawei are adding dedicated chips to smartphones to handle such tasks. These complement the CPU and GPU already in phones and are tuned to run one specific kind of work, AI, much faster, at the expense of being able to do anything else. They also keep AI tasks from draining phone batteries as fast. Apple has dubbed its version the Neural Engine, built into the A11 Bionic chip, while Huawei's neural processing unit sits inside its Kirin 970 chip.

Apple’s Neural Engine will handle tasks like its new Face ID facial recognition, understanding voice commands for Siri, and image processing.

This isn’t just a smartphone trend, either. Google built an AI chip called the Tensor Processing Unit, which will soon be available to customers of its cloud business. Microsoft is developing similar cloud technology and also has a custom chip in its HoloLens. Last fall, Intel acquired a company called Movidius to build AI chips focused on image processing.

Not everyone believes AI-specific chips are the future. Graphics-processing-unit manufacturer Nvidia found in the mid-2000s that its GPUs greatly accelerated deep-learning algorithms, which held great promise in AI research. The company invested in software to support running those algorithms on its GPUs, and has seen a massive explosion in revenue since the beginning of the AI boom in 2012. GPUs are less power-efficient than specialty chips, which matters for data centers and cell phones, but they have the added bonus of being able to handle complex graphics like virtual reality, augmented reality, and gaming. (It’s also worth noting that Nvidia doesn’t have much of a presence in mobile GPUs. That space is owned by Qualcomm, which makes the mobile GPU in flagship phones like the Samsung Galaxy S8, HTC U11, LG V30, and Google Pixel.)

Apple, like Google, has staked much of its future on increasing the intelligence of its devices. Although the company introduced the first mainstream virtual personal assistant, Siri, in 2011, it has seemed from the outside that Apple lags in AI research and development.

Apple stresses that this isn’t the case. It recently ramped up the visibility of its AI research and implementation, last year hiring veteran AI researcher Ruslan Salakhutdinov from Carnegie Mellon to run its research efforts. The company also began publishing research papers (in contrast to its typical silence on internal R&D), and started a blog to further communicate its pursuits.

During its June WWDC developer conference, Apple introduced Core ML, a framework for building artificial intelligence algorithms into apps for Apple products. Since then, Google and Amazon have built converters to make models from their own machine-learning frameworks compatible with Apple’s format. The company has also implemented AI in its phones, tablets, and computers, and is enhancing Siri’s capabilities.
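To give a sense of what Core ML looks like to a developer, here is a minimal sketch of on-device image classification using Core ML together with Apple’s Vision framework. The model name (FlowerClassifier) and the input image are hypothetical stand-ins: Xcode generates a Swift class like this from whatever .mlmodel file a developer adds to a project.

import CoreML
import Vision
import CoreGraphics

// Minimal sketch: run a bundled Core ML image classifier through the Vision framework.
// "FlowerClassifier" is a hypothetical .mlmodel added to an Xcode project; Xcode
// auto-generates a Swift class of the same name that wraps the underlying MLModel.
func classify(image: CGImage) {
    guard let visionModel = try? VNCoreMLModel(for: FlowerClassifier().model) else {
        print("Could not load the Core ML model")
        return
    }

    // The request runs the model and returns classification observations, best match first.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("Prediction: \(top.identifier) (confidence \(top.confidence))")
    }

    // The handler performs the request on the supplied image.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}

The point of the framework is that the model runs entirely on the device, and Core ML decides how best to execute it, so app developers never have to manage the underlying hardware themselves.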