Categories
Developer Links

Ideally, Apple Intelligence Could Query Your Personal Context »

Jason Snell on the Upgrade podcast:

It’s the idea that there’s a personal data trove that you have. And then you’re asking the model to query the contents. […] You know about all this stuff, now do stuff with it.

And if you can do that on device, it’s really powerful. [I]t’s hard to do that in the cloud because you would actually need to upload it to Private Cloud Compute. And that’s a lot of data. So what you want is some parsing to happen on device.

But that’s the dream, right? Is that your phone knows about your stuff and then the LLM that’s running can make distinctions based on your stuff.

And ideally the model could potentially, and I know this is wild, I don’t think they’ve talked about it, but ideally the model could query your personal context, get a bunch of related data out, and then send that in a query to the private cloud and have it process it, right? You could have a kind of cherry picking your personal data model on device that then kicks it to the more powerful model to process it and intuit things about it.

There’s lots of ways you could do this. It’s a great idea. It was a great idea in 2024 when they showed it, but they got to do it – is the challenge there.

In reply to Jason, cohost Myke Hurley said the following:

So, I’m just going to make a little prediction. These[…] things that I’ve spoken about, we will see iOS 27 before [they] ship.

I believe they will have stuff – like, I believe they will have [releases] in the spring like it has been rumored, but I don’t think all of these things.

I think particularly the Personal Context thing… we may never see that.

For what it’s worth, Apple has done this and named it Queries. Shortcuts users might know this better as the Find actions, which let shortcuts find and filter data from apps before using it.

Introduced for developers alongside the App Intents API in 2022, Queries are how intents/actions retrieve entities/data from apps. In their most recent session “Get to know App Intents” from 2025, they explicitly say the following – a phrase that caught my attention in regards to the “new Siri” we’ve been waiting for:

Queries are the way the system can reason about my entities

Apple has also been building out their ability to index and query these entities through their Spotlight support, as well as now Visual Intelligence.

You can learn more about Entity Queries & Indexed Entities, and watch the developer sessions for Get to Know App Intents & Explore new advances in App Intents.
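To make the idea concrete, here’s a minimal sketch of an entity and its query. `BookEntity` and the sample data are hypothetical stand-ins for an app’s real content; the shape of the `AppEntity`, `IndexedEntity`, and `EntityQuery` protocols is what matters:

```swift
import AppIntents
import Foundation

// Hypothetical entity standing in for an app's own content type.
struct BookEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Book"

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    static var defaultQuery = BookQuery()
}

// The query is how the system "reasons about" entities: it asks for
// them by identifier, or for suggestions to populate Find actions
// and pickers, without ever reading the app's database directly.
struct BookQuery: EntityQuery {
    func entities(for identifiers: [BookEntity.ID]) async throws -> [BookEntity] {
        // Look up persisted books matching the identifiers (stubbed here).
        sampleBooks.filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [BookEntity] {
        sampleBooks
    }
}

let sampleBooks = [BookEntity(id: UUID(), title: "Example Book")]
```

Conforming to `IndexedEntity` is what additionally lets Spotlight index the entity for system-wide search.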

Check out Upgrade #588, follow the show on Apple Podcasts, or watch the video on YouTube.

Categories
Developer Links

How to integrate your app with Visual Intelligence »

From the Apple Developer documentation (break added):

With visual intelligence, people can visually search for information and content that matches their surroundings, or an onscreen object.

Integrating your app with visual intelligence allows people to view your matching content quickly and launch your app for more detailed information or additional search results, giving it additional visibility.

And:

To integrate your app with visual intelligence, the Visual Intelligence framework provides information about objects it detects in the visual intelligence camera or a screenshot. To exchange information with your app, the system uses the App Intents framework and its concepts of app intents and app entities.

When a person performs visual search on the visual intelligence camera or a screenshot, the system forwards the information captured to an App Intents query you implement. In your query code, search your app’s content for matching items, and return them to visual intelligence as app entities. Visual intelligence then uses the app entities to display your content in the search results view, right where a person needs it.

To learn more about a displayed item, someone can tap it to open the item in your app and view information and functionality. For example, an app that allows people to view information about landmarks might show detailed information like hours, a map, or community reviews for the item a person taps in visual search.
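Based on the documentation quoted above, the integration point is a query that receives a descriptor of what visual intelligence detected and returns matching app entities. This is a sketch: `LandmarkEntity` and `matchLandmarks` are placeholders for an app’s own types and search code.

```swift
import AppIntents
import VisualIntelligence

// Placeholder entity standing in for the app's own content type.
struct LandmarkEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Landmark"

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    static var defaultQuery = LandmarkQuery()
}

struct LandmarkQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [LandmarkEntity] { [] }
}

// Placeholder for the app's real search implementation.
func matchLandmarks(against labels: [String]) -> [LandmarkEntity] { [] }

// The query visual intelligence calls with what it captured; search
// your catalog against the descriptor and return matches as entities.
struct LandmarkSearchQuery: IntentValueQuery {
    func values(for input: SemanticContentDescriptor) async throws -> [LandmarkEntity] {
        matchLandmarks(against: input.labels)
    }
}
```

The returned entities are what the system renders in the visual search results view, and tapping one launches the app via its intents.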

Browse the full documentation from the Apple Developer site and learn how to use Visual Intelligence for iPhone.

 

Categories
Developer Links News

Apple’s Foundation Models Framework Unlocks New App Experiences Powered by Apple Intelligence »

From Apple Newsroom:

With the release of iOS 26, iPadOS 26, and macOS 26 this month, developers around the world are able to bring even more intelligent experiences right into their apps by tapping into the on-device large language model at the core of Apple Intelligence.

The Foundation Models framework allows developers to create new intelligence features that protect users’ privacy and are available offline, all while using AI inference that is free of cost.

You can now add intelligence to your apps for free on Apple platforms – and while it’s relatively simple today… that’s only for now.
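For a sense of how little code “adding intelligence” takes, here’s a minimal sketch of prompting the on-device model with the Foundation Models framework. The instructions string and the `summarize` function are my own illustration, not Apple sample code:

```swift
import FoundationModels

// Minimal sketch: free-form text in, text out -- on device,
// offline-capable, and with no per-request cost.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```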

View the full article.

Categories
Developer

New App Intents and Apple Intelligence Consulting Availability

App Intents are how Apple devices understand and interact with your app. They’re the foundation of features like Shortcuts, Siri, Spotlight – and now Apple Intelligence. If you want your app to take advantage of the deepest parts of the Apple ecosystem, it starts with App Intents.

Following recent projects with Foodnoms, MindNode, and Tripsy, I now have availability for App Intents consulting this fall and into 2026. In addition to full start-to-finish projects, I’m introducing new flexible options:

  • Audits: a focused review of your existing intents, data models, and opportunities
  • Docs-only: structured documentation you can use with your team to implement directly

If you want Apple Intelligence to understand your app’s core features, or you want to deploy your app across the system to make a cohesive experience, I can help you design and deliver the following:

  • The unreleased Actions and Context portions of Apple Intelligence
  • App Intents, App Entities, and App Enums for your app
  • Automatically-generated instances of important intents as App Shortcuts
  • Spotlight, Siri, and Controls integrations
  • Custom Shortcuts to be distributed to users
  • Documentation on the new offerings
  • Ongoing updates shared in a developer newsletter

Each engagement starts with a free, 1-hour call to assess your needs, discuss budgets and rates, and outline next steps – whether you’re a brand, part of a team, or an indie developer, we can find a solution that works for you.

You can learn more about my services, explore past client work, and watch my conference talks on my Consulting page. If you’re ready to move forward, book a call with me directly to get started.

Let’s make your app one of the best citizens of the Apple ecosystem – ready for Apple Intelligence, Shortcuts, and beyond.

Categories
Developer

Here are Apple’s WWDC25 Developer Sessions on the Foundation Models Framework

At WWDC25, Apple expanded access to their Foundation Models to third-party developers, making intelligence features easier to implement while maintaining privacy.

With the framework, developers are able to access local, on-device models from Apple, make requests to Private Cloud Compute when needed, and readily adopt tools like the Vision framework or SpeechAnalyzer.
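Because the model only runs on Apple Intelligence-capable devices, the sessions emphasize checking availability before showing intelligence features. A sketch of that check, with my own fallback messages:

```swift
import FoundationModels

// Sketch: gate intelligence features on the on-device model's
// availability, and fall back gracefully when it can't be used.
func modelStatusDescription() -> String {
    let model = SystemLanguageModel.default
    switch model.availability {
    case .available:
        return "Ready for on-device inference"
    case .unavailable(let reason):
        // e.g. device not eligible, Apple Intelligence turned off,
        // or model assets still downloading.
        return "Unavailable: \(reason)"
    }
}
```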

In introducing these capabilities, Apple has produced the following Machine Learning & AI sessions:

Apple Developer sessions on Machine Learning & AI from WWDC2025

Intro

Foundation Models

MLX

Features

More

Explore all the Machine Learning & AI sessions from WWDC25, plus check out my recommended viewing order for the App Intents sessions.


P.S. Here’s the full list of sessions, no sections – copy these into your notes:

List of Apple Developer sessions on Machine Learning & AI from WWDC2025

Categories
Developer

Watch the WWDC2025 App Intents Developer Sessions In This Order

After announcing updates at WWDC, Apple released four new developer sessions directly related to App Intents—the API that lets Apple Intelligence understand and interact with apps—following up on sessions from years past.

Here are this year’s sessions – in my recommended viewing order:

  1. Get to Know App Intents (24:36)
  2. Explore new advances in App Intents (26:49)
  3. Develop for Shortcuts and Spotlight with App Intents (18:56)
  4. Design Interactive Snippets (7:28)

Start with the summary of the API, see what’s new this year, learn the most relevant ways users will interact with your app, and then take a look at advances in snippets – in 1 1/2 hours of focused viewing.

Enjoy – there’s lots to learn!

Check out all the Machine Learning & AI videos from WWDC25 from Apple, plus check out my curated list of the Foundation Models framework sessions.

Categories
Developer Links News

Apple Supercharges Its Tools and Technologies for Developers to Foster Creativity, Innovation, and Design »

From Apple’s announcements at WWDC:

App Intents lets developers deeply integrate their app’s actions and content with system experiences across platforms, including Siri, Spotlight, widgets, controls, and more.

This year, App Intents gains support for visual intelligence. This enables apps to provide visual search results within the visual intelligence experience, allowing users to go directly into the app from those results. For instance, Etsy is leveraging visual intelligence to enhance the user experience in its iOS app by facilitating faster and more intuitive discovery of goods and products.

“At Etsy, our job is to seamlessly connect shoppers with creative entrepreneurs around the world who offer extraordinary items — many of which are hard to describe. The ability to meet shoppers right on their iPhone with visual intelligence is a meaningful unlock, and makes it easier than ever for buyers to quickly discover exactly what they’re looking for while directly supporting small businesses,” said Etsy CTO Rafe Colburn.

Read the full post from the Apple Newsroom.

Categories
Developer Livestreams Offsite

AI + iOS: The State of Apple Development Ahead of WWDC (feat. Rudrank Riyam)

From my stream with Rudrank Riyam on YouTube Live — tune in:

This week, developer Rudrank Riyam, author of the AiOS Dispatch newsletter, joins me to talk about his experiences developing for Apple platforms using AI-assisted coding, and what we’re interested in ahead of Apple’s Worldwide Developer Conference (WWDC).

Subscribe to the newsletter here: https://aiosdispatch.com

View the stream live or catch the replay on YouTube.

Categories
Developer Livestreams Offsite

Apple Intelligence: Action Centered Design Framework (feat. Vidit Bhargava)

From my stream with Vidit Bhargava on YouTube Live — tune in:

This week, designer and developer Vidit Bhargava joins me to talk about his framework for app development centered around designing actions first, particularly as it relates to Apple Intelligence.

Read about the framework here: https://blog.viditb.com/action-centered-design-framework-talk/

Chapters (generated with Descript):

00:00 Introduction and Guest Welcome

02:32 Guest Background and App Development

03:45 Evolution of App Design

05:56 Action Centered Design Framework

08:46 Designing for Multiple Platforms

13:21 App Intents and Practical Examples

16:52 Future of App Design and AI Integration

48:14 Demo and Practical Applications

57:06 Exploring App Intents and Practicality

57:27 Challenges of Mobile AI Implementation

58:12 Battery Life and AI Advancements

58:53 Apple’s Approach to AI and Actions

01:01:35 The Future of Shortcuts and Automation

01:03:38 Innovative UI and Interaction Design

01:08:53 Custom Interactions and Maintenance

01:10:27 Generative Coding and Platform Variability

01:15:17 AI and App Intents in Real-World Applications

01:33:59 Economic Models for AI-Driven Apps

01:41:20 Concluding Thoughts and Future Prospects

View the stream live or catch the replay on YouTube.

Categories
Announcements Developer

Announcing My WWDC Meetup: Apple Intelligence Automators at CommunityKit

Hello friends! It’s my pleasure to announce my second-annual WWDC meetup 1, this time as part of the free CommunityKit conference under the name “Apple Intelligence Automators” – sign up here for the free event on Tuesday, June 10 from 2:00 PM – 4:00 PM.

Located inside the Hyatt House San Jose / Cupertino at 10380 Perimeter Rd in Cupertino (just a few minutes from Apple Park), we’ll be discussing the announcements from the WWDC keynote address and State of the Union from the day prior as they relate to Apple Intelligence, App Intents, and Shortcuts.

With Apple Intelligence being the focus of last year’s WWDC, and delays on those features pushing things back, we should have plenty to talk about.

Check out the event page on Luma to register and don’t forget to get your free ticket to CommunityKit.

  1. I hosted a Shortcuts meetup last year – and had a blast.
Categories
Developer

You Should Watch The Apple Intelligence Developer Sessions In This Order

(Editor’s note: updated June 2025 to include sessions from WWDC25)

If you’re getting into development for Apple Intelligence, it can be hard to understand how to parse Apple’s documentation. App Intents, the API that powers the Actions and Personal Context features of Apple Intelligence, has been shipping since 2022, with a deeper history going back to the introduction of Shortcuts in 2018 – there are over 30 sessions to learn from.

Since I’ve been consulting with developers on their App Intents integrations, I’ve developed a Star Wars Machete Order-style guide for the Apple Intelligence developer sessions – watch the sessions in this order to best understand how Apple thinks about these APIs.

Apple Intelligence Machete Order

How to understand the App Intents framework

Start with the latest sessions from 2024, which reintroduce App Intents as it extends across the system in more ways, and which update the design suggestions from the earlier framework:

Getting Deeper into App Intents

From there, once you have the context of how App Intents can be deployed, start back at the beginning to see how to implement App Intents, then take a look at where they are heading with Snippets:

Importance of App Shortcuts

Built on top of App Intents, App Shortcuts are automatically generated for the most-important tasks and content that show up in Spotlight and Siri – and often the most-common way users interact with the App Intents framework:
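For reference, this is roughly what that looks like in code: an `AppShortcutsProvider` wraps an intent so it appears in Spotlight and Siri with zero user setup. The intent name and phrase here are hypothetical examples:

```swift
import AppIntents

// Hypothetical intent standing in for one of an app's key tasks.
struct OpenReadingListIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Reading List"

    func perform() async throws -> some IntentResult {
        .result()
    }
}

// App Shortcuts are generated automatically from this provider --
// no user configuration needed for them to show up in Spotlight
// and be invocable by the phrases below in Siri.
struct ExampleShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenReadingListIntent(),
            phrases: ["Open my reading list in \(.applicationName)"],
            shortTitle: "Reading List",
            systemImageName: "books.vertical"
        )
    }
}
```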

Apple Intelligence sessions

Finally, once you understand the core of App Intents, what it used to be vs. what Apple wants you to do now, and how to deploy App Intents across Spotlight and Siri, move onto the latest updates for Apple Intelligence – new features that enable Personal Context, as well as integrating your intents into domains for Siri:

Good to know

Beyond that, it can be helpful to review earlier sessions to understand where Apple is coming from, as well as learning about the lesser-known experiences your app is capable of providing:

All the Apple Intelligence developer sessions

For good measure, here’s the full list of the Shortcuts / App Intents / Apple Intelligence developer sessions:

Check out more Machine Learning and AI videos from the Apple Developer site, read the full App Intents documentation, and learn more about Apple Intelligence.

P.S. You can hire me to design your App Intents integration.

 

Categories
Developer Siri Shortcuts

How Apple Will Win the AI Race: My Talk on App Intents & Apple Intelligence

Last Tuesday, I gave a talk to over 300 developers at Deep Dish Swift about Apple Intelligence, where I made the following claim:

Apple will win the AI race

I’m an expert on App Intents, the API that powers the yet-to-be-seen features of Apple Intelligence – Actions and Personal Context. After designing implementations with my clients, watching the trends around AI-assisted coding, hearing rumors of an iOS 19 redesign, and seeing the acceleration effects of artificial intelligence, I believe Apple is skating to where the puck will be, rather than where it is now.

I’ll leave the thesis for the talk – but if you’re building for any Apple devices, you’ll want to understand how important App Intents is to the future of the platform:

Watch the 54-minute talk from Deep Dish Swift on YouTube Live.

Categories
Announcements Developer

No Ticket to WWDC? Come to CommunityKit, the New, Free Alt Conf

If you’re interested in “going” to WWDC, but don’t have a developer ticket – you should sign up for CommunityKit, the alternative conference1 for Apple developers, media, and fans.

From June 9 through 11, join us at the Hyatt House Cupertino to gather with your fellow participants, learn so many new things, and build some great memories.

Each day, we’ll be joined by our wonderful communities, such as Shortcuts and iOSDevHappyHour, to name a few. We’ll also be hosting a live recording of Swift over Coffee, focusing on everything new at WWDC.

Yes, you read that right – I’ll be hosting a Shortcuts/Apple Intelligence/App Intents meetup during one of the afternoons that week! Schedules will be announced later, and I’ll update this post plus create another when I know my official time slot.

Located just a few minutes away from Main Street Cupertino and the Visitor Center at Apple Park, this free conference is designed specifically to make it easy to know where to go if you’re in town for WWDC, merging past events like the watch party from the iOS Dev Happy Hour into one event.

You can watch the WWDC Keynote and State of the Union with developer friends on Monday, plus attend live podcast recordings, join community meetups like mine, and access a hackathon space to work on new ideas all day Tuesday & Wednesday.

To be clear: this means most social events are moving from San Jose to being more focused in Cupertino this year, so folks don’t have to make their way back-and-forth across those 8 miles as much. This also means anyone coming from out-of-town or from San Francisco can stay/park at the Hyatt House each day and easily access most WWDC social events.

If you’re unsure if it’s worth coming to WWDC, let this post convince you – it’ll be a blast and you’ll have something fun to do on Monday, Tuesday, and Wednesday that week.

WWDC is back!2 Get your free ticket to CommunityKit now.


  1. Not to be confused with the now-defunct AltConf
  2. Yes, the official conference has been back for years. But I kept hearing people at Deep Dish Swift ask if the social WWDC is “back”.

    Yes, it is! The social scene has been growing for a few years, but it took a while to come together.

    Now, more of us are coordinating together to make it like the old days where, if you didn’t have a ticket, you could go to AltConf. Now, you can go to CommunityKit! 

 

 

Categories
Developer Siri Shortcuts

Tune In To My Apple Intelligence Talk via Deep Dish Swift Live

I’m super excited to be giving my talk on Apple Intelligence live tomorrow at Deep Dish Swift – if you’re interested in tuning in to the conference stream, follow Deep Dish Swift on YouTube:

Check out Deep Dish Swift live and learn more about the conference.

Categories
Developer Livestreams Offsite

The Future of App Development: Artificial Intelligence and App Intents (feat. Connor Hammond)

From my stream with Connor Hammond on YouTube Live — tune in:

AI consultant and app developer Connor Hammond joins me to talk about the future of app development in a world of AI, particularly in relation to Apple’s App Intents APIs.

We discussed questions such as:

  • What does it mean to develop with AI at your side?
  • As AI tools speed up development, how can app developers harness new capabilities within Apple’s schedule?
  • If App Intents is Apple’s strategy, what does that mean for all AI platforms?
  • How can apps take advantage of App Intents to deploy their functionality across Apple’s platforms?
  • How do AI-enabled apps provide more value than the AI tools themselves?

View the stream live or catch the replay on YouTube.

Categories
Announcements Developer Links News

Apple Is Delaying the ‘More Personalized Siri’ Apple Intelligence Features »

On Wednesday, March 5, I posted the blog post “New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4”. Today, Friday, March 7, Apple said “Nope” to John Gruber – here’s the quote from Apple spokesperson Jacqueline Roy from his story on Daring Fireball:

“Siri helps our users find what they need and get things done quickly, and in just the past six months, we’ve made Siri more conversational, introduced new features like type to Siri and product knowledge, and added an integration with ChatGPT. We’ve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps. It’s going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year.”

Gruber also gives his analysis of the situation, which you should read in full.

Oh, and those API pages? Gone.

View the whole story on Daring Fireball.

Categories
Developer News

New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4

The first look at Personal Context for Apple Intelligence is here, as APIs available in the iOS 18.4 developer betas allow apps to surface their content for the system to understand. This sets the stage for the most significant update to Siri so far, where all your apps can provide Siri with the available views and content to work with – in a secure and private manner, too.

As first mentioned by Prathamesh Kowarkar on Mastodon, there is now a suite of APIs in beta that associate an app’s unique content, called an entity, with a specific view – this allows Siri to read what’s indexed on-screen and use it with other apps’ actions when triggered by a command.
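As a sketch of how that association looks in practice, based on how these beta APIs appeared: the view publishes an `NSUserActivity` tagged with the identifier of the entity currently on screen. `PhotoEntity`, the view, and the activity type string here are all placeholders.

```swift
import AppIntents
import SwiftUI

// Placeholder entity standing in for an app's own content type.
struct PhotoEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Photo"

    var id: String
    var caption: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(caption)")
    }

    static var defaultQuery = PhotoQuery()
}

struct PhotoQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [PhotoEntity] { [] }
}

// Sketch: tie the entity to the visible view via NSUserActivity, so the
// system can resolve "this" when the user refers to onscreen content.
struct PhotoDetailView: View {
    let photo: PhotoEntity

    var body: some View {
        Image(systemName: "photo")
            .userActivity("com.example.viewingPhoto") { activity in
                activity.appEntityIdentifier = EntityIdentifier(for: photo)
            }
    }
}
```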

APIs like this are necessary for the coming Siri update to actually do what Apple says Apple Intelligence is capable of – now that the functionality is here, however, it’s up to developers to implement everything to make sure the experience works well.

Here are the new pages:

If these APIs are in beta now, it stands to reason they’ll leave beta after iOS 18.4 releases in full – which means Personal Context might be coming as early as iOS 18.4.

Check out the post from Kowarkar on Mastodon.

Update: Nope – it’s officially delayed.

Categories
Announcements Developer

Announcement: I’ll Be Speaking About Apple Intelligence at Deep Dish Swift This April

It’s my pleasure to announce that I’ll be a conference speaker at Deep Dish Swift in Chicago, which runs this April 27 to April 29, 2025!

My talk will be covering Apple Intelligence and the portion powered by App Intents APIs, in many ways acting as a follow-up to my just-posted talk “Preparing your App for Apple Intelligence” about App Intents from this past September at NSSpain.

I’ll be joining a wonderful group of speakers from the Apple developer community for three days of talks, a live recording of the Launched podcast, and a lot of deep dish pizza.

Check out all the details on the Deep Dish Swift website. If you’re an Apple developer, buy your ticket to attend – and, if you’re going, let me know and I’ll see you there!

Categories
Developer News

Preparing Your App for Apple Intelligence: My Conference Talk from NSSpain 2024

For anyone who is interested in Apple Intelligence and learning about the App Intents APIs that power it, but don’t know where to start, my conference talk from last year covers everything you’ll need to know to get up to speed.

“Preparing Your App for Apple Intelligence,” delivered at NSSpain 2024, covers everything to know prior to the launch of Apple Intelligence, including a brief history of Workflow, how we got from Shortcuts to App Intents, and a look forward at Apple Intelligence. Here’s the video on Vimeo, among all the 2024 talks:

My thanks to the folks at NSSpain for letting me open the conference with my first talk ever. I admittedly went slightly long, but only because I packed the presentation with loads of information – and a bit of nerves. Check out this set of photos from my talk:

The rest of the conference was a delight, including a wide array of great speakers and a fancy winery dinner, all set in Logroño, Spain during the town’s annual wine festival week – surely an event to remember.

If you’re a developer who’s interested in working with me to add Apple Intelligence support and App Intents to your app, contact me directly.

Watch the whole video on Vimeo, browse the set of NSSpain 2024 talks, and check out the NSSpain website.

Categories
Developer

How to integrate Siri and Apple Intelligence into your app to query onscreen content »

From Jordan Morgan, Developer Advocate at Superwall:

Since the announcement of Apple Intelligence, developers have been in a bit of holding pattern with iOS 18. Many of the marquee APIs to hook into Siri and Apple Intelligence weren’t available with its initial release. For example, hooking into onscreen content (here’s an example).

The ability to have Siri perform tasks from your app, or shuttle data from it to another one, based on your personal context is a game-changer. We’ve always been able to ask Siri to perform tasks in our apps, but Siri couldn’t understand the specifics of what you were currently looking at.

Take the example I linked to above, from Apple’s W.W.D.C. session, “Bring your app to Siri.” In it, the presenter asks Siri to favorite a photo. Apple Intelligence makes this flow better starting with iOS 18.2, since we can expose our app’s onscreen content. Basically, this opens up two things for developers:

  1. Primarily, you can create entities that Siri can understand, simply from them being visible on screen. The entity from the example above was a photo, and it was recognized when the presenter said “Hey Siri, move this photo to the California album.”
  2. And, you can create intents that Siri can use with that content, or from content coming to your app from somewhere else. Moving the photo to another album from the example would be the intent at play.

What’s key to understand here is that the photo is an entity Siri can use and moving it to an album was an intent. And, as a bonus – you can shuttle the data around easier with Transferable. This isn’t new but is important. Again, from the example, this is how Siri took the photo and sent it in an email.
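That last piece is standard `CoreTransferable`, so it’s easy to sketch. Here `PhotoEntity` and its `imageData` property are placeholders for an app’s real storage:

```swift
import CoreTransferable
import Foundation
import UniformTypeIdentifiers

// Placeholder entity; `imageData` stands in for the app's real storage.
struct PhotoEntity {
    var imageData: Data
}

// Conforming the entity to Transferable lets the system hand its data
// to another app -- e.g. how Siri took the photo and sent it in an email.
extension PhotoEntity: Transferable {
    static var transferRepresentation: some TransferRepresentation {
        DataRepresentation(exportedContentType: .png) { photo in
            photo.imageData
        }
    }
}
```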

Today, I’ll show you how it works and break down the key parts you’ll need. However, there’s an important catch: Siri can’t actually perform these tasks yet. While the APIs are available to developers, the personal context features aren’t live. Once they are, everything will ‘just work.’ For now, Apple seems to be giving us the tools today so we can be ready for tomorrow.

Friend of the site Jordan Morgan is back with yet another excellent guide around App Intents – this time preparing us for the real Apple Intelligence experience.

View the original.