Here are Apple’s WWDC25 Developer Sessions on the Foundation Models Framework

Discover WWDC25’s Machine Learning & AI sessions for developers—Foundation Models, MLX, Vision, and SpeechAnalyzer, all focused on private AI

At WWDC25, Apple expanded access to its Foundation Models to third-party developers, making intelligence features easier to implement while maintaining privacy.

With the framework, developers can access Apple's local, on-device models, make requests to Private Cloud Compute when needed, and readily adopt tools like the Vision framework or SpeechAnalyzer.
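To give a sense of how lightweight this is, here's a minimal sketch of prompting the on-device model with the Foundation Models framework in Swift. This is a simplified illustration based on the API as presented at WWDC25 (availability handling and error cases are abbreviated, and the prompt text is just an example):

```swift
import FoundationModels

// Check that the on-device model is available on this device
// before creating a session.
let model = SystemLanguageModel.default

if model.availability == .available {
    // A session manages the conversation with the on-device model.
    let session = LanguageModelSession()

    // Send a prompt and await the model's response.
    let response = try await session.respond(
        to: "Suggest three tags for a note about WWDC25 sessions."
    )
    print(response.content)
}
```

Because the model runs locally, requests like this don't leave the device, which is the privacy story Apple emphasizes throughout these sessions.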

In introducing these capabilities, Apple has produced the following Machine Learning & AI sessions:

Apple Developer sessions on Machine Learning & AI from WWDC2025


Explore all the Machine Learning & AI sessions from WWDC25, plus check out my recommended viewing order for the App Intents sessions.


P.S. Here’s the full list of sessions, no sections – copy these into your notes:

List of Apple Developer sessions on Machine Learning & AI from WWDC2025
