New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4

New developer APIs in iOS 18.4 hint at the coming Personal Context feature – now apps can tell Siri what she's looking at.

The first look at Personal Context for Apple Intelligence is here: APIs available in the iOS 18.4 developer betas allow apps to surface their content for the system to understand. This sets the stage for the most significant update to Siri so far, where all your apps can provide Siri with their available views and content to work with – in a secure and private manner, too.

As first mentioned by Prathamesh Kowarkar on Mastodon, there is now a suite of APIs in beta that associate an app’s unique content, called an entity, with a specific view – this allows Siri to read what’s indexed on-screen and use it with other apps’ actions when triggered by a command.
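
To make that concrete, here’s a minimal sketch of what adoption might look like in a hypothetical notes app – `NoteEntity`, `NoteQuery`, and the view controller are illustrative names, and while the `appEntityIdentifier` property reflects the new beta surface, the exact identifier type and initializer shown are assumptions rather than confirmed API:

```swift
import AppIntents
import UIKit

// An entity representing a note in a hypothetical notes app.
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Resolves identifiers back into full entities when the system asks.
struct NoteQuery: EntityQuery {
    func entities(for identifiers: [NoteEntity.ID]) async throws -> [NoteEntity] {
        // Look the notes up in the app's store (stubbed out for this sketch).
        []
    }
}

// In the screen that displays a note, advertise the onscreen entity
// so Siri can resolve references like "this note."
final class NoteViewController: UIViewController {
    func showNote(_ note: NoteEntity) {
        let activity = NSUserActivity(activityType: "com.example.notes.viewing")
        activity.title = note.title
        // New in the iOS 18.4 beta SDK; the identifier type and initializer
        // here are assumptions – verify against the current beta docs.
        activity.appEntityIdentifier = AppEntityIdentifier(for: note)
        userActivity = activity
        activity.becomeCurrent()
    }
}
```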

APIs like this are necessary for the coming Siri update to actually deliver what Apple says Apple Intelligence is capable of – now that the functionality is here, however, it’s up to developers to implement it all to make sure the experience works well.
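
The flip side is giving Siri actions it can route that onscreen content into. Reusing the hypothetical `NoteEntity` from the sketch above, an App Intents action that could accept an onscreen note might look like this – the intent itself is illustrative, not a documented example:

```swift
import AppIntents

// A hypothetical action Siri could hand an onscreen note to,
// e.g. "Summarize this note" while the note is visible.
struct SummarizeNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note"

    @Parameter(title: "Note")
    var note: NoteEntity

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real summarization would happen here; stubbed for the sketch.
        return .result(dialog: "Here's a summary of \(note.title).")
    }
}
```

Once intents like this exist, the system can connect what’s on screen in one app with an action in another – which is the core promise of Personal Context.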

Here are the new pages:

If these APIs are in beta now, it stands to reason they’ll exit beta when iOS 18.4 ships in full – which means Personal Context might arrive as early as iOS 18.4.

Check out the post from Kowarkar on Mastodon.

Update: Nope – it’s officially delayed.
