From Jordan Morgan, Developer Advocate at Superwall:
Since the announcement of Apple Intelligence, developers have been in a bit of a holding pattern with iOS 18. Many of the marquee APIs to hook into Siri and Apple Intelligence weren’t available with its initial release. For example, hooking into onscreen content (here’s an example).
The ability to have Siri perform tasks from your app, or shuttle data from it to another one, based on your personal context is a game-changer. We’ve always been able to ask Siri to perform tasks in our apps, but Siri couldn’t understand the specifics of what you were currently looking at.
Take the example I linked to above, from Apple’s W.W.D.C. session, “Bring your app to Siri.” In it, the presenter asks Siri to favorite a photo. Apple Intelligence makes this flow better starting with iOS 18.2, since we can expose our app’s onscreen content. Basically, this opens up two things for developers:
Primarily, you can create entities that Siri can understand, simply from them being visible on screen. The entity from the example above was a photo, and it was recognized when the presenter said “Hey Siri, move this photo to the California album.”
And, you can create intents that Siri can use with that content, or from content coming to your app from somewhere else. Moving the photo to another album from the example would be the intent at play.
What’s key to understand here is that the photo is an entity Siri can use, and moving it to an album is an intent. And, as a bonus – you can shuttle the data around more easily with Transferable. This isn’t new, but it is important. Again, from the example, this is how Siri took the photo and sent it in an email.
Today, I’ll show you how it works and break down the key parts you’ll need. However, there’s an important catch: Siri can’t actually perform these tasks yet. While the APIs are available to developers, the personal context features aren’t live. Once they are, everything will ‘just work.’ For now, Apple seems to be giving us the tools today so we can be ready for tomorrow.
Friend of the site Jordan Morgan is back with yet another excellent guide around App Intents – this time preparing us for the real Apple Intelligence experience.
From Jordan Morgan, developer advocate at Superwall:
Let’s add a new file, and call it GetCaffeineIntent.swift:
struct GetCaffeineIntent: AppIntent {
    static var title = LocalizedStringResource("Get Caffeine Intake")
    static var description = IntentDescription("Shows how much caffeine you've had today.")

    func perform() async throws -> some IntentResult {
        let store = CaffeineStore.shared
        let amount = store.amountIngested
        return amount
    }
}
This has all three of the things we mentioned above:
It has a title (“Get Caffeine Intake”).
A description of what happens when we use it (“Shows how much caffeine you’ve had today.”)
And, an implementation of that action, vended via the perform function.
However, if we build and run — we’ll get a compiler error:
`Return type of instance method 'perform()' requires that 'Double' conform to 'IntentResult'`
Looking at the return type, it’s some IntentResult. This is critical to understand to avoid a lot of undue frustration with App Intents. You always return some form of an IntentResult. For example, if your intent just does an action, and has nothing of value to say about that action — you can simply return .result(). You don’t ever return some primitive or domain specific type like we’ve done above.
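For instance, here’s a minimal sketch of an intent that only performs an action (the intent itself and the reset() call on CaffeineStore are hypothetical, not from the original post):

struct ResetCaffeineIntent: AppIntent {
    static var title = LocalizedStringResource("Reset Caffeine Intake")

    func perform() async throws -> some IntentResult {
        // Hypothetical method: whatever side effect your intent performs.
        CaffeineStore.shared.reset()
        return .result()
    }
}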
Ours, though? It would be useful to tell the user how much caffeine they’ve had and return the actual amount, so change the return type to mark that the intent returns two things:
An actual Double value of how much caffeine has been consumed.
And, some dialog to speak out their caffeine for the day.
So, instead of some IntentResult, here’s what we need:
func perform() async throws -> some IntentResult & ReturnsValue<Double> & ProvidesDialog {
    let store = CaffeineStore.shared
    let amount = store.amountIngested
    return .result(value: amount,
                   dialog: .init("You've had \(store.formattedAmount(for: .dailyIntake))."))
}
Each intent’s return type needs to start with the opaque some IntentResult type, but from there we can also include more specific types. Here, we’ve noted that we return a Double value and speak out dialog.
Developers should read the entire post, but I highlighted this portion because it’s fairly poorly documented and incredibly important.
Jordan also covers basic Intent setup, more on the Entity front, and using Siri Tips and Shortcuts Links to bring more visibility to your actions.
Today I just happened to stumble across the Apple Developer page for “Shortcuts for developers”, which first launched in July 2023 and which Apple designed as a landing page for all things, well, Shortcuts and development for it.
Here’s how Apple pitches Shortcuts there:
Increase your app’s surface area and help users quickly access the most important views and actions in your app. With no user setup required, App Shortcuts are available as soon as your app is installed in iOS, iPadOS, visionOS, or watchOS and can be run from Spotlight, the Home Screen, the Shortcuts app, or even by using your voice with Siri.
Underneath that, Apple calls attention to App Intents, which they describe this way:
Enable shortcuts with App Intents, a Swift-only framework designed to make it faster and easier to build great actions that people can access throughout the system.
The page also links to the documentation to help you begin implementing App Intents with these starting points:
I’m glad Apple has made this resource for developers unfamiliar with Shortcuts and App Intents, as it’s a clear jumping-off point while emphasizing the value of Shortcuts and what apps can enable for their users.
I hope to see this page updated for App Intents in iOS 18 and Apple Intelligence, clarifying the connection between what you can do with Siri and how it’s all going to be available in the Shortcuts app as well.
In order to model this in an App Entity using parameterSummary, we would then need a different summary for every task type (“automatic”, “specific”, “next on page”, “next in category”), as well as accounting for each chart type. This is 8 combinations.
Now consider another of our widgets: the “Tasks” widget. This lets you choose up to 4 different tasks, and has the same options:
In this case, there would be 16 combinations, which really doesn’t scale well. It’s extremely hard to maintain and is inflexible if future changes are needed.
To solve this, I introduced a new type called TaskTypeAppEntity, which encapsulates the four different types (automatic, specific, next on page, next in category) in a single entity:
struct TaskTypeAppEntity: AppEntity, Identifiable {
    static let typeDisplayRepresentation: TypeDisplayRepresentation = "Task Type"

    let id: TaskTypeAppEntityIdentifier
    var title: String

    var displayRepresentation: DisplayRepresentation {
        .init(title: .init(stringLiteral: title))
    }

    static let defaultQuery = TaskTypeAppEntityQuery()
}

enum TaskTypeAppEntityIdentifier {
    case automatic
    case task(TaskID)
    case category(CategoryID)
    case page(PageID)
}
When building the defaultQuery, it’s just a case of including all of the options in that query. You can even group it into separate sections:
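As a rough sketch (assuming TaskTypeAppEntityIdentifier is made Hashable and EntityIdentifierConvertible, for example by encoding it to and from a String), the query could look something like this, with the data sources left as placeholders:

struct TaskTypeAppEntityQuery: EntityQuery {
    func entities(for identifiers: [TaskTypeAppEntityIdentifier]) async throws -> [TaskTypeAppEntity] {
        // Resolve previously saved identifiers back into full entities.
        allOptions().filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [TaskTypeAppEntity] {
        // Surface every option: automatic, plus each specific task, page, and category.
        allOptions()
    }

    private func allOptions() -> [TaskTypeAppEntity] {
        var options = [TaskTypeAppEntity(id: .automatic, title: "Automatic")]
        // Append one entity per task, page, and category from the app's own data store, e.g.:
        // options += tasks.map { TaskTypeAppEntity(id: .task($0.id), title: $0.name) }
        return options
    }
}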
This is more code than I’ve probably ever shared on my site, but this method is incredibly useful for apps with more complex data models that want to build a clean App Intents experience for their users – all App Intents developers should take a look and see if they can use this.
I’ll definitely be recommending this to my clients – I had to save this here on my blog, and I rewrote my own headline to clarify that this is useful beyond widget configurations, and beyond just moving from INIntents to App Intents.
Developers, indies, and organizations – let’s work together to make your App Intents and Apple Intelligence implementation the best!
I’m now available for consultation work on Apple’s App Intents and Shortcuts implementations for app developers of all sizes, whether you’re a team, an indie dev, or part of a larger organization.
With my 10+ years of experience working in Shortcuts (as a former member of the Workflow team), I’ve built out a consulting process to educate, strategize, design, and build on top of a complete App Intents and Shortcuts implementation for your app.
I’ll explain the history of Shortcuts, how we got from Workflow to Apple Intelligence, walk you through the developer sessions, recommend intents specific to your needs, share personalized feedback on your implementations, provide extensive documentation and copywriting, and even create any number of custom workflows on top of what we build together, extending your app to the ecosystem with hand-built third-party integrations.
Plus, we can do everything mentioned above together, or just a few parts – let’s schedule a one-hour meeting and discuss what you need.
Pricing varies per engagement, but I’m happy to customize anything depending on your goals and timeline.
Now is the time – let’s build App Intents for the rest of us.
Here’s the description from Apple (with line breaks added for readability):
Learn about improvements and refinements to App Intents, and discover how this framework can help you expose your app’s functionality to Siri and all-new features.
We’ll show you how to make your entities more meaningful to the platform with the Transferable API, File Representations, new IntentFile APIs, and Spotlight Indexing, opening up powerful functionality in Siri and the Shortcuts app.
Empower your intents to take people deep into your app with URL Representable Entities.
Explore new techniques to model your entities and intents with new APIs for error handling and union values.
Before I begin, this video also assumes you’re already aware of the basics of the App Intents API, building off the main sessions from 2022 and 2023 (which you should watch if you haven’t yet).
In many ways, the extended lifespan of this API is why Apple has released other new videos to explain why you should add App Intents and how to design them (which I’ll cover in future posts) in an updated context for 2024.
In reintroducing App Intents in 2024, developer Kenny York starts off the video emphasizing the variety of experiences App Intents already powers, including Shortcuts, Spotlight, widgets, the Action button, and the new Apple Pencil squeeze (introduced in May 2024).
This year, those experiences extend to include Apple Intelligence and the new Controls experience in Control Center (both covered in future posts).
The App Intents framework itself is also expanding beyond intents and App Shortcuts to emphasize entities, helping the system understand and use them directly when invoked. And finally, developer improvements are designed to make App Intents easier to build, so they’re not as hard to implement now that developers will be creating a lot more of them.
By default, opening Spotlight on your device already creates a rich experience showing content you might want to see, from suggested apps and actions to a daily summary or recent searches.
Spotlight also helps you search for specific content inside an app (including a new preview experience that I’ve yet to discover how to design on your own).
Last year, Apple improved this experience by allowing developers to create App Shortcuts specifically for the Spotlight experience (as covered in their “Explore enhancements to App Intents” video) and emphasize the most important actions (and entities) from their apps.
IndexedEntity protocol
In order to take this even further, Apple has introduced a new IndexedEntity protocol that lets you index the entities from an app, making them available to the powerful new semantic search capabilities of your device in iOS 18.
With this protocol, developers can create an index of all the items (or entities) from their app, giving each item a set of attributes (including keywords) and even assigning them a priority to match features like favorites lists.
Then, by handing it off to Spotlight when the app is launched or updated, all that data is indexed and searchable, plus more easily matched when queried using natural language.
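As a rough sketch of what that can look like (BookEntity is a hypothetical entity that already conforms to AppEntity, and the indexing call reflects my reading of the new Core Spotlight API):

import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

extension BookEntity: IndexedEntity {
    // Extra attributes Spotlight can use when matching natural-language queries.
    var attributeSet: CSSearchableItemAttributeSet {
        let attributes = CSSearchableItemAttributeSet(contentType: .content)
        attributes.title = title
        attributes.keywords = genres   // hypothetical [String] property on the entity
        return attributes
    }
}

// Hand the entities off to Spotlight on launch or whenever the data changes,
// giving favorites a higher priority.
func reindexBooks(_ books: [BookEntity]) async throws {
    try await CSSearchableIndex.default()
        .indexAppEntities(books.filter(\.isFavorite), priority: 10)
    try await CSSearchableIndex.default().indexAppEntities(books)
}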
In order to add an extra layer of meaning beyond the App Entity and attributes, Apple is allowing developers to declare how their entities can be converted into new formats, which formats their App Intents can accept, and also securely provide files themselves in lieu of direct access.
Transferable AppEntity
Another issue with expanding the AppEntity protocol and making all the items in your apps accessible via AppIntents is what ends up being a format issue – how does data from one app properly convert into the right data for another app? In their example, a trail entity from their app could become a PDF in another when passed via App Intents.
To help with this, Apple now allows developers to extend their entities with Transferable, an API introduced in 2022 designed to convert formats from one type to another.
With Transferable, developers can specify which types of data they want their entity to convert to, including fallback options sorted by priority.
For anyone with a long history of Shortcuts and its original independent version Workflow, this acts like a modern-day version of the Content Graph engine, which is what powered Workflow’s inputs/outputs to intelligently convert data types when needed. Now, it’s built into App Intents, powered by a Swift API, and developers have control over which data types their entities get converted into and when.
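For example, here’s a sketch of what this could look like for a hypothetical TrailEntity, where the name property and the PDF-rendering helper are placeholders; representations listed first take priority, with later ones acting as fallbacks:

import CoreTransferable
import UniformTypeIdentifiers

extension TrailEntity: Transferable {
    static var transferRepresentation: some TransferRepresentation {
        // Preferred: export the trail as a PDF.
        DataRepresentation(exportedContentType: .pdf) { trail in
            trail.renderedPDFData()   // hypothetical helper returning Data
        }
        // Fallback: just the trail's name as plain text.
        ProxyRepresentation(exporting: \.name)
    }
}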
Improvements to IntentFile
On the flip side, when an App Intent is designed to receive an entity, developers can specify which of these content types it can support, what’s available from the other app, and which one to pick.
Plus, when that content type isn’t supported, it can give the app the original URL of the content and pass it to Transferable for conversion.
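As a hedged sketch of how that could look on the receiving side, where AttachFileIntent is a made-up example and the supportedContentTypes option reflects my understanding of the new parameter API:

import AppIntents
import UniformTypeIdentifiers

struct AttachFileIntent: AppIntent {
    static var title = LocalizedStringResource("Attach File")

    // Declare which content types this intent accepts; content in other formats
    // can be converted on the sending side via its Transferable representations.
    @Parameter(title: "Attachment", supportedContentTypes: [.pdf, .png])
    var attachment: IntentFile

    func perform() async throws -> some IntentResult {
        // The received file's bytes are available via attachment.data;
        // hand them off to your app's own storage here.
        return .result()
    }
}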
FileEntity
Finally, Apple has also addressed limitations for document-based apps where the file itself is the entity, and not a representation of an object from your database – in these cases, the FileEntity API is needed to operate on the actual file itself.
Here, FileEntity allows Siri and Shortcuts to securely access such files, perform the operation, and pass the result along to the next step using an ID instead of the file itself.
With this method, apps can provide information about their files and other apps can operate on those files, but handled properly through App Intents instead of direct access.
Beyond making it easier to meaningfully search for content within your app, Apple is also improving deep linking to those specific entities using Universal Links.
While the slide above is in full developer-speak, what Apple has introduced this year is the ability to generate and open links to any item, page, or action within your app. In essence, every part of every app is now represented by a URL – which makes sense, since it is a universal resource locator.
Now, App Intents allows URLs to point to every section, every thing you see, and every action you can take in your app.
In practice, this will mostly be used by developers as the universal method of accessing a given item from another action step in Shortcuts, from another app requesting that Transferable operate on an entity, or to open content right after creating it.
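Here’s a rough sketch of what that could look like, following the pattern from the session as I understand it, with TrailEntity and the domain as placeholders:

extension TrailEntity: URLRepresentableEntity {
    static var urlRepresentation: URLRepresentation {
        // Interpolates the entity's identifier into a universal link for the item.
        "https://trails.example.com/trail/\(.id)/details"
    }
}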
However, I’m fascinated by the larger implications here – everything in every app is linkable now… I’ll have to do more research to really flesh out what this means for apps that are also web services.
Now that developers can use App Intents to index all of their entities, make them meaningful to Spotlight and Siri, and interlink between any entity, action, or section, there’s a lot of App Intents work to be done by developers.
Thankfully, Apple already designed App Intents from the ground up to make the entire framework functional in code only (compared to the Xcode UI for custom intents), extendable by only requiring you to add onto existing App Intents implementations, and, in most cases, already in-use for features like widgets.
This year, they’re building off the improvements of the last few years in two specific ways – a new UnionValue macro that makes multiple parameters or properties representable under one type, and the ability for Xcode 16.0 to generate title strings automatically.
UnionValue
For developers, UnionValue means they can let users choose between more complex data types in a union under a new, single value – Apple describes it as an “or” parameter.
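A rough sketch of how that could look, where the macro usage follows my understanding of the new API and TaskEntity/HabitEntity are hypothetical entities from your app:

@UnionValue
enum TrackableItem {
    case task(TaskEntity)
    case habit(HabitEntity)
}

struct LogItemIntent: AppIntent {
    static var title = LocalizedStringResource("Log Item")

    // A single "or" parameter: the user can hand this intent either entity type.
    @Parameter(title: "Item")
    var item: TrackableItem

    func perform() async throws -> some IntentResult & ProvidesDialog {
        switch item {
        case .task:
            return .result(dialog: "Logged your task.")
        case .habit:
            return .result(dialog: "Logged your habit.")
        }
    }
}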
Generated titles
And developers creating lots of App Intents can avoid duplicating unnecessary work designing titles that are already implied by a struct’s property – instead of adding the title “Location” to a location, Xcode can simply figure it out. And when developers still want the control to specify a unique title, they can add it back in.
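For example, in a hypothetical intent like this one, the first parameter’s title can be inferred from its property name while the second spells its own out:

struct CheckWeatherIntent: AppIntent {
    static var title = LocalizedStringResource("Check Weather")

    // Xcode 16 can infer the title "Location" from the property name...
    @Parameter
    var location: String

    // ...while you can still specify a unique title when you want different wording.
    @Parameter(title: "Backup Location")
    var backupLocation: String

    func perform() async throws -> some IntentResult {
        .result()
    }
}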
Framework improvements
Finally, in order to make App Intents work better with the new changes to App Entities, Apple has removed limitations on the interaction between the two and lets you define app entities in a framework and reference them from your app and extension targets.
However, it should be noted that Apple hasn’t expanded this to libraries outside of a framework – for now, libraries are not supported.
Apple also explicitly mentioned that not all the changes to App Intents are covered in this video, as some changes are directly relevant to other sessions on Apple Intelligence and Controls, and those changes are only covered in those sessions.
Here’s the full list from the graphic above for any developers searching through the App Intents documentation for everything new:
This year, Apple has expanded the App Intents framework to act as an all-encompassing tool – from defining intents and building App Shortcuts on top of them to fully indexing all the entities from your apps and making them meaningful, linkable, and transferable – and, in general, to make it easier to develop, integrate, and maintain a balanced App Intents integration with less effort and fewer roadblocks.
For developers, this means making your app, its objects, capabilities, and interface accessible to the rest of the ecosystem in a thoughtful and secure way.
For users, this means all your apps have added a layer that Apple Intelligence can understand, that you can control from within the Shortcuts app, and that will continue to be developed against for years to come.
For Apple, this means that Siri and your devices can finally understand what your apps can do, what you’ve created with them, and the connections between it all – and in a private way that doesn’t allow apps, users, or even Apple itself to abuse that access.
In many ways, App Intents is a new operating system for apps themselves, letting them talk to the system, tell it what’s possible, and be ready for anything when the time comes.
Apple is also pushing App Intents in a huge way – I’ll cover it in the next session, but they’re declaring “Everything in your app should be an App Intent.” That means this is the API for how your apps work now, and will be going forward.
Most Apple users will simply experience App Intents by way of a better Spotlight, the enhanced Siri integrations, and features like widgets or the Action button, without ever having heard of the technology powering it all.
But behind the scenes, App Intents will be working hard to make sure everything functions properly – Siri will have gotten better “overnight,” while Apple has been building up App Intents for years to get us here.
Developers, are you looking for help with your App Intents implementation?
Matthew Cassinelli is an expert on Shortcuts who has been writing articles and making YouTube videos about the subject for years. He was nice enough to give a presentation on App Intents and Shortcuts for developers at our May event.
iOSDevHappyHour🍻 is a place where current and aspiring iOS devs can come together and enjoy a good time once a month.