Apple is shifting into the warm arms of AI, making a series of AI company acquisitions, hiring staff, and planning hardware updates that will bake AI into the next generation of iPhones.
Apple is reportedly building on-device AI capabilities into its next-gen iPhones rather than relying on the cloud. The company has been buying AI startups left and right, acquiring 21 of them since 2017. Its most recent AI purchase, in early 2023, was California-based startup WaveOne, which works on AI-powered video compression.
The company has been more cautious with generative AI chatbots and image creators than OpenAI, Microsoft, Google, and Adobe have been so far. Even so, Apple has reportedly been spending millions of dollars per day on multiple AI projects, including AI-powered text, voice, and image tools.
Wedbush Securities analyst Daniel Ives told The Financial Times that he would be surprised if Apple didn't make a major AI-related deal this year, while Morgan Stanley reports that close to half of Apple's new AI job postings mention deep learning, a sign that Apple is serious about AI (and you'd be silly to think that Apple, of all companies, wasn't serious about the future of AI, especially in its best-selling device: the iPhone).
Morgan Stanley expects Apple's iOS 18 unveiling at the Worldwide Developers Conference in June 2024 to have a heavy focus on AI, with Apple's in-house LLM -- dubbed "Ajax GPT" -- possibly ending up as the engine behind an improved version of Siri, which would be very welcome on the iPhone.
Apple's upcoming in-house M3 and A17 Pro chips would benefit from dedicated AI silicon, similar to the NPU (Neural Processing Unit) inside Intel's new Core Ultra "Meteor Lake" processors found in new laptops. Meanwhile, Microsoft is all-systems-go on AI and the AI PC in 2024, with an AI-powered version of Windows reportedly coming in the form of Windows 12.
Apple could process LLMs on-device in future iPhone models, something teased in a research paper Apple published last month, which proposed using flash memory for on-device LLM processing. Google and Samsung recently unveiled smartphones that use onboard AI hardware for image editing, real-time text translation, and other tasks.
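Apple hasn't published code for this, but the core idea of flash-based LLM inference can be sketched in a few lines: keep the large weight matrices in flash storage (here, simply a file on disk) and page in only the rows a given step actually needs, rather than loading the whole model into RAM. Everything below -- the file path, matrix sizes, and the list of "active" neurons -- is illustrative, not Apple's actual method.

```python
import numpy as np
import tempfile, os

# Hypothetical sketch: a large weight matrix lives in "flash"
# (a file on disk standing in for flash storage).
rows, cols = 4096, 1024
path = os.path.join(tempfile.mkdtemp(), "weights.npy")
np.save(path, np.random.default_rng(0)
        .standard_normal((rows, cols)).astype(np.float32))

# Memory-map the file: no row is read into RAM until it is touched.
weights = np.load(path, mmap_mode="r")

# Suppose a predictor says only these neurons matter this step
# (illustrative indices, not a real sparsity predictor).
active = np.array([3, 17, 512, 4000])
x = np.ones(cols, dtype=np.float32)

# Only the selected rows are paged in from disk for the matmul,
# so peak RAM use stays far below the full matrix size.
partial_out = weights[active] @ x
print(partial_out.shape)  # (4,)
```

The appeal on a phone is obvious: a multi-gigabyte model can sit in flash while only a small working set occupies RAM at any moment, at the cost of extra storage reads per token.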
Apple's next-gen iPhone will surely pack onboard AI power, but processing LLMs on the device instead of in the cloud introduces a new set of problems. Overheating? Increased battery drain? Slowing down other parts of the iPhone? We'll find out later this year, I guess.