- At WWDC on Monday, Apple subtly touted how much it’s doing in cutting-edge artificial intelligence and machine learning.
- Unlike most tech companies that develop AI, Apple does advanced processing on its own devices rather than relying on the cloud.
- Instead of talking about AI models and technologies, Apple’s product focus usually means it simply shows off new features that are quietly enabled by AI behind the scenes.
Apple Park is seen ahead of the Worldwide Developers Conference (WWDC) in Cupertino, California, on June 5, 2023.
Josh Edelson | AFP | Getty Images
At Apple’s annual developer conference, WWDC, on Monday, the company subtly touted how much it’s working on cutting-edge artificial intelligence and machine learning.
Apple seemed to sit on the sidelines as companies like Microsoft, Google and OpenAI embraced cutting-edge machine learning technologies like chatbots and generative AI.
But on Monday, Apple announced several key AI features, including an improved iPhone autocorrect based on a machine learning program that uses a transformer language model, the same technology that underpins ChatGPT. Apple says the feature will learn from the user’s text messages and typing habits to improve over time.
“The moment you want to type the word ducking, your keyboard will learn it too,” said Craig Federighi, Apple’s software chief, joking about autocorrect’s tendency to replace a common expletive with the nonsense word “ducking.”
Monday’s biggest news was the Vision Pro, a flashy new augmented reality headset, but the event still showed how committed Apple is to developing cutting-edge machine learning and artificial intelligence. OpenAI’s ChatGPT may have reached over 100 million users within two months of its launch last year, but Apple is now applying the same underlying technology to improve features that the owners of more than 1 billion iPhones use every day.
Unlike rivals that build massive models using server farms, supercomputers and terabytes of data, Apple wants to run AI models on its devices. The new autocorrect feature is especially impressive because it runs entirely on the iPhone, while models like ChatGPT require hundreds of expensive GPUs working together.
On-device AI also sidesteps many of the data privacy issues that cloud-based AI faces. When a model can run on the phone itself, Apple needs to collect less data in order to operate it.
The approach is also closely tied to Apple’s control of its hardware stack, all the way down to its own silicon chips. Every year, Apple builds new AI circuits and GPUs into its chips, and its control of the overall architecture allows it to adapt to changes and new techniques.
Apple doesn’t like to talk about “artificial intelligence.” It prefers the more academic term “machine learning,” or simply talks about the capabilities the technology enables.
Some of the other big AI companies have leaders from academic backgrounds, which leads to an emphasis on showing your work: explaining how it might improve in the future, and documenting it so others can study and build on it.
Apple is a product company, and one that has been relentlessly secretive for decades. Instead of discussing a specific AI model, its training data, or how it might improve in the future, Apple simply mentions a feature and says there’s great technology working behind the scenes.
One example announced Monday was an improvement to the AirPods Pro that automatically turns off noise cancellation when the user engages in conversation. Apple doesn’t position this as a machine learning feature, but it’s a hard problem to solve, and the solution is based on AI models.
In one of the boldest features announced Monday, Apple’s new Digital Persona feature makes a 3D scan of the user’s face and body, then virtually recreates their likeness while they videoconference with other people wearing the Vision Pro headset.
Apple also mentioned several other new features that leverage the company’s neural network technology, such as the ability to identify the fields to fill out in a PDF.
One of the biggest cheers of the afternoon in Cupertino came for an iPhone machine learning feature that can identify a user’s pet, as opposed to other cats or dogs, and gather all of that pet’s photos in a folder.