Here are Apple’s WWDC25 Developer Sessions on the Foundation Models Framework

Discover WWDC25’s Machine Learning & AI sessions for developers—Foundation Models, MLX, Vision, and SpeechAnalyzer, all focused on private AI

At WWDC25, Apple opened its Foundation Models to third-party developers, making intelligence features easier to implement while maintaining privacy.

With the framework, developers can access Apple's local, on-device models, make requests to Private Cloud Compute when needed, and readily adopt companion tools like the Vision framework and SpeechAnalyzer.
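
To give a sense of what that looks like in code, here's a minimal Swift sketch of prompting the on-device model, based on the API Apple showed in the WWDC25 Foundation Models sessions. The function name, prompt text, and fallback behavior are illustrative assumptions, not something from this post.

```swift
import FoundationModels

// A minimal sketch of prompting the on-device model with the
// Foundation Models framework, as introduced at WWDC25.
// The prompt and the fallback behavior below are illustrative.
func summarize(_ text: String) async throws -> String {
    // Confirm the system language model is available on this device
    // (Apple Intelligence enabled, model downloaded, hardware eligible).
    guard case .available = SystemLanguageModel.default.availability else {
        return text
    }

    // Open a session with the on-device model and send a prompt.
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize in one sentence: \(text)")
    return response.content
}
```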

In introducing these capabilities, Apple has produced the following Machine Learning & AI sessions:

Apple Developer sessions on Machine Learning & AI from WWDC25

Intro

Foundation Models

MLX

Features

More

Explore all the Machine Learning & AI sessions from WWDC25, plus check out my recommended viewing order for the App Intents sessions.


P.S. Here’s the full list of sessions with no sections, so you can copy them straight into your notes:

List of Apple Developer sessions on Machine Learning & AI from WWDC25
