Apple’s Core ML now lets app developers update AI models on the fly

Apple today introduced upgrades to its Core ML machine learning framework, including model encryption using Xcode and Core ML Model Deployment, a way to store and deploy models and update them independently of the app update cycle. AI within apps can power a range of features, from natural language or image classification to analysis of speech, sounds, and other media.

“In the past, you would have to push more app updates just to get the newer models in your user’s hands. Now with model deployment, you can quickly and easily update your models without updating the app itself,” Apple engineer Anil Katti said in a WWDC session.

Core ML Model Deployment also gives developers a way to group models into collections and offers targeted deployment, with models customized by operating system, device, region, app version, and other variables. Developers must enable Core ML Model Deployment through the Core ML API.
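In iOS 14, this surfaces as the MLModelCollection API in Core ML. As a rough illustration, a minimal Swift sketch of fetching a deployed model at runtime might look like the following; the collection identifier "FeatureModels" and model name "SentimentClassifier" are hypothetical placeholders for whatever a developer configures in Core ML Model Deployment.

```swift
import CoreML

// A minimal sketch of fetching a deployed model at runtime.
// "FeatureModels" and "SentimentClassifier" are hypothetical names for a
// model collection and one of its models, as configured in Core ML
// Model Deployment.
func loadDeployedModel(completion: @escaping (MLModel?) -> Void) {
    // Begin accessing the collection; Core ML downloads the deployment
    // targeted at this device, OS, region, and app version as needed.
    _ = MLModelCollection.beginAccessing(identifier: "FeatureModels") { result in
        switch result {
        case .success(let collection):
            // Each entry points to a compiled model on disk.
            if let entry = collection.entries["SentimentClassifier"],
               let model = try? MLModel(contentsOf: entry.modelURL) {
                completion(model)
            } else {
                completion(nil)
            }
        case .failure(let error):
            print("Model collection unavailable: \(error)")
            completion(nil) // e.g., fall back to a model bundled with the app
        }
    }
}
```

Because the download happens out of band, apps would typically keep a bundled model as a fallback until the deployed version arrives on the device.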

Also new for Apple developers using the Create ML Mac app to train models are templates for action classification in videos and style transfer for images and video. The action classification template uses pose estimation to track the movement of the human body and can power apps for exercise or dance training. Developers can train an action classification model using videos with one human subject performing one action type at a time.
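The Create ML app drives the same training machinery the CreateML framework exposes in code. As a loose sketch, and assuming the framework's MLActionClassifier with a folder-per-label data source (the paths and label names here are hypothetical, and the exact data-source case follows the convention of other Create ML classifiers), programmatic training on a Mac might look like this:

```swift
import CreateML
import Foundation

// A minimal sketch of training an action classifier with the CreateML
// framework on macOS. Assumed layout: /path/to/ActionVideos contains one
// subfolder per action label (e.g., "JumpingJacks/", "Lunges/"), each
// holding videos of a single person performing that one action.
let trainingData = URL(fileURLWithPath: "/path/to/ActionVideos")

let classifier = try MLActionClassifier(
    trainingData: .labeledDirectories(at: trainingData)
)

// Export the trained model so it can ship in an app bundle or through a
// Core ML Model Deployment collection.
try classifier.write(to: URL(fileURLWithPath: "/path/to/ActionClassifier.mlmodel"))
```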


Apple first introduced Core ML in 2017. Like Google’s ML Kit, it is designed to help developers quickly add AI-powered features, such as natural language processing and computer vision, to their apps.

The Core ML updates follow a keynote address Monday by Apple executives as part of the company’s Worldwide Developers Conference (WWDC). In the prerecorded address, Apple announced plans to transition its personal computers to ARM-based CPUs by the end of the year and showed off updates to iPadOS, watchOS, macOS, and Apple’s smart home offering.

iOS 14 will come with new features like continuous picture-in-picture video, an App Library that automatically organizes apps, and the ability to add widgets to the iOS home screen. Siri is also getting a user interface update that lets results show up inside other apps or on the home screen, so you can see the latest reminders, weather, or search results. iOS 14 will also come with a new Translate app that uses on-device machine learning to translate conversations between people who speak different languages, starting with languages such as English, Italian, Japanese, and Mandarin Chinese. And it will include App Clips, a way to quickly launch apps using things found in the real world, like NFC tags or QR codes. Each App Clip must be less than 10MB in size.

Last year, Core ML 3 added on-device training for model personalization and privacy. On-device machine learning improvements introduced for Apple users on Monday include better handwriting recognition on iPads and speech-to-text dictation on iPhones.
