New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4

New developer APIs in iOS 18.4 hint at the coming Personal Context feature – now apps can tell Siri what she's looking at.

The first look at Personal Context for Apple Intelligence is here: APIs available in the iOS 18.4 developer betas allow apps to surface their content for the system to understand. This sets the stage for the most significant update to Siri so far, where all your apps can provide Siri with their views and content to work with – in a secure and private manner, too.

As first mentioned by Prathamesh Kowarkar on Mastodon, there is now a suite of APIs in beta that associate an app's unique content, called an entity, with a specific view – this allows Siri to read what's indexed on-screen and use it with other apps' actions when triggered by a command.
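
Kowarkar's post doesn't walk through the code itself, but here's a rough sketch of what adoption might look like, building on the App Intents framework that's been around since iOS 16. The entity and query pieces are standard App Intents; the view association at the end is my assumption of how the new hook works – names like appEntityIdentifier and EntityIdentifier(for:) come from my reading of the beta documentation and could change before release:

```swift
import AppIntents
import SwiftUI

// A note the app exposes to the system as an App Intents entity.
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// A minimal query so the system can resolve notes by identifier.
struct NoteQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [NoteEntity] {
        // Look the notes up in the app's own store (stubbed here).
        identifiers.map { NoteEntity(id: $0, title: "Untitled") }
    }
}

// Associating the entity with the view that's currently on screen.
// The .userActivity modifier is long-standing SwiftUI API; the
// appEntityIdentifier property is the new beta hook as I understand it,
// and its exact name and shape are an assumption on my part.
struct NoteView: View {
    let note: NoteEntity

    var body: some View {
        Text(note.title)
            .userActivity("com.example.viewing-note") { activity in
                activity.title = note.title
                activity.appEntityIdentifier = EntityIdentifier(for: note)
            }
    }
}
```

The idea is that when this view is frontmost, the system knows exactly which entity it maps to – so Siri can take what's on screen and pass it along to other apps' actions.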

APIs like this are necessary for the coming Siri update to actually do what Apple says Apple Intelligence is capable of – now that the functionality is here, it's up to developers to adopt it and make sure the experience works well.

Here are the new pages:

If these APIs are in beta now, it stands to reason they'll ship when iOS 18.4 is released in full – which means Personal Context might arrive as early as iOS 18.4.

Check out the post from Kowarkar on Mastodon.
