Hello friends! It’s my pleasure to announce my second-annual WWDC meetup1, this time as part of the free CommunityKit conference under the name “Apple Intelligence Automators” – sign up here for the free event on Tuesday, June 10 from 2:00 PM - 4:00 PM.

Located inside the Hyatt House San Jose / Cupertino at 10380 Perimeter Rd in Cupertino (just a few minutes from Apple Park), we’ll be discussing the announcements from the WWDC keynote address and State of the Union from the day prior as they relate to Apple Intelligence, App Intents, and Shortcuts.
With Apple Intelligence the focus of last year’s WWDC, and delays pushing those features back, we should have plenty to talk about.
Check out the event page on Luma to register and don’t forget to get your free ticket to CommunityKit.
- I hosted a Shortcuts meetup last year – and had a blast. ↩
If you’re getting into development for Apple Intelligence, it can be hard to know how to navigate Apple’s documentation. App Intents, the API that powers the Actions and Personal Context features of Apple Intelligence, has been shipping since 2022, with a deeper history since the introduction of Shortcuts in 2018 – there are over 30 sessions to learn from.
Since I’ve been consulting with developers on their App Intents integrations, I’ve developed a Star Wars Machete Order-style guide for the Apple Intelligence developer sessions – watch the sessions in this order to best understand how Apple thinks about these APIs.
Apple Intelligence Machete Order
How to understand the App Intents framework
Start with the latest sessions from 2024, which reintroduce App Intents as it extends across the system in more ways, and which update the design guidance from the earlier sessions:
- Bring your app’s core features to users with App Intents (2024)
- Design App Intents for System Experiences (2024) – directly replaces suggestions in Design great actions for Shortcuts, Siri, and Suggestions (2021)
Basics of App Intents and the importance of App Shortcuts
From there, once you have the context of how App Intents can be deployed, go back to the beginning to see how to implement App Intents and App Shortcuts – introduced for Siri in the first year, then updated for Spotlight the next.
- Dive into App Intents (2022)
- Design App Shortcuts (2022)
- Implement App Shortcuts with App Intents (2022)
- Spotlight your app with App Shortcuts (2023)
- Design shortcuts for Spotlight (2023)
- Explore enhancements to App Intents (2023)
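The core pattern those sessions walk through can be boiled down to a small sketch: an intent your app exposes, plus an App Shortcuts provider that surfaces it in Siri and Spotlight with zero user setup. The intent, phrases, and names below are hypothetical placeholders, not code from any Apple sample:

```swift
import AppIntents

// A minimal intent: one verb your app exposes to the system.
struct OpenJournalIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Journal"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // Navigate to the journal screen here (app-specific logic).
        return .result()
    }
}

// App Shortcuts make the intent available in Siri and Spotlight
// immediately on install – no user configuration required.
struct JournalShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenJournalIntent(),
            phrases: ["Open my journal in \(.applicationName)"],
            shortTitle: "Open Journal",
            systemImageName: "book"
        )
    }
}
```

Everything else in the sessions – parameters, entities, Spotlight indexing – builds on this two-part structure.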
Apple Intelligence sessions
Finally, once you understand the core of App Intents, what it used to be vs. what Apple wants you to do now, and how to deploy App Intents across Spotlight and Siri, move on to the latest updates for Apple Intelligence – new features that enable Personal Context, as well as integrating your intents into domains for Siri:
- What’s new with App Intents (2024)
- Bring your app to Siri (2024)
Good to know
Beyond that, it can be helpful to review earlier sessions to understand where Apple is coming from, as well as to learn about the lesser-known experiences your app is capable of providing:
- Introducing parameters for Shortcuts (2019)
- Add configuration and intelligence to your widgets (2020)
- Evaluate and Optimize Voice Interactions For Your App (2020)
- Broaden your reach with Siri event suggestions (2020)
- Decipher and deal with common Siri errors (2020)
- Create quick interactions with Shortcuts on watchOS (2020)
- Donate intents and expand your app’s presence (2021)
- Meet Focus Filters (2022)
All the Apple Intelligence developer sessions
For good measure, here’s the full list of the Shortcuts / App Intents / Apple Intelligence developer sessions – I’ll update this list after WWDC’25 with new sessions (and create a new post):
- 2018
- 2019
- 2020
- What’s new with SiriKit and Shortcuts
- Empower your intents
- Evaluate and Optimize Voice Interactions For Your App
- Add configuration and intelligence to your widgets
- Design for Intelligence series
- Decipher and deal with common Siri errors
- Create quick interactions with Shortcuts on watchOS
- Broaden your reach with Siri event suggestions
- 2021
- 2022
- 2023
- 2024
- Human Interface Guidelines
- Documentation
Check out more Machine Learning and AI videos from the Apple Developer site, read the full App Intents documentation, and learn more about Apple Intelligence.
P.S. Let me know if you want help with your App Intents integration.
Earlier this evening, I was working on my soon-to-be-relaunched newsletter What’s New in Apple Intelligence, and I opened Reeder to find an article from Daring Fireball titled “15 Years Later: ‘Very Insightful and Not Negative’.”
In the post, Gruber recalled Steve Jobs’ comment about his blog, which he inexplicably had never linked to (until now) – upon reading that, I realized I had my own Tim Cook moment in a similar realm.
Six years ago, on February 28, 2019, I published a story for iMore where I collected over 100 apps that work with Siri, tweeting:
If you’ve been wanting to get into Siri Shortcuts but don’t know where to start...
Here are 100 apps with shortcuts you can add to Siri and use with your own custom phrase: https://x.com/iMore/status/1101285345390444545
The next day, at about 3pm, Tim Cook quoted my post, saying:
“Worth a look. Something for everyone.”
In the moment, I sent a simple “Thanks Tim!” back, but never posted about it on my blog. So, here we are – while you’re here, see what else I have that’s worth a look – there’s something for everyone.
Check out Tim Cook’s tweet and view the story on Daring Fireball.
Today is Global Accessibility Awareness Day (GAAD), which Apple highlighted in their press release showcasing accessibility features coming in the next year – plus a new Accessibility shortcut called Hold That Thought:
New features include Accessibility Nutrition Labels on the App Store, Magnifier for Mac, Braille Access, and Accessibility Reader; plus innovative updates to Live Listen, visionOS, Personal Voice, and more.
Near the end of the release, Apple explains their new shortcut, plus the addition of the previous Accessibility Shortcut to Vision Pro:
The Shortcuts app adds Hold That Thought, a shortcut that prompts users to capture and recall information in a note so interruptions don’t derail their flow. The Accessibility Assistant shortcut has been added to Shortcuts on Apple Vision Pro to help recommend accessibility features based on user preferences.
Here’s how Apple describes the shortcut:
Interruptions can cause you to forget tasks and affect productivity, especially for neurodivergent individuals.
When you run this shortcut, you have two options: Capture and Recall.
Run the shortcut and select Capture to capture a screenshot of what you’re doing, any calendar events in the next hour, current open webpage in Safari (Mac only), and Clipboard contents. You'll then be prompted to write short notes about what you are doing and what you are about to do. Run the shortcut again and select Recall to find the last created note with all the captured information. All notes will be saved with the title “Hold that thought” and the date and time saved.
Run this shortcut using Siri, or add it to the Control Center, Action button or to the Home Screen for quick access.
I love this idea, and the core concept matches the inspiration for my currently-secret app idea that I teased at the end of my Deep Dish Swift talk.
I do have a few suggestions for improvements to the shortcut, however:
- Remove the errant space in the Choose From Menu prompt between “Capture” and “or” – it says “Capture or recall last stopping point?”
- For both the “📸 Capture” and “🔁 Recall” options in the Choose From Menu action, Apple should add Synonyms for “Capture” and “Recall” – the emoji can cause issues when dictating to Siri (in general, I avoid emoji in Menus for this reason).
- Utilize the “Find Tabs” action for iOS instead of simply not adding any functionality for Safari on mobile. Apple’s use of only “Get Current Safari Tab” for Mac reminds me that they still have not brought the set of Safari iOS actions added back in 2022 to macOS – their absence in this shortcut furthers my belief that these highly-sought actions are deprioritized simply because the team doesn’t use iOS as often and this Mac action is “good enough”.
- The second “Recall” option just opens the note, but I’d rather see the last item I saved – Apple should have gone further to isolate the most recent item and display the recalled information, not just open it again. (I tried to Recall from my Apple Watch and the shortcut simply failed.)
- The flow of an alert, a 5-second countdown before a screenshot, and two prompts might be too long for most neurodivergent people to capture information effectively while in the process of being interrupted.
To improve the shortcut as it is today, I’d simply remove the Show Alert and Wait actions, and assign this new shortcut to the Action button – that way you can immediately take a screenshot, then answer the prompts, and move on.
Going further, I’d love to see a new version of this next year once Apple Intelligence ships in full, which utilizes “Get On-Screen Content” and accesses all the data available from apps for Personal Context.
Get “Hold That Thought” for Shortcuts, view the announcement from the Apple Newsroom, and check out past updates from GAAD.
Last Tuesday, I gave a talk to over 300 developers at Deep Dish Swift about Apple Intelligence, where I made the following claim:
Apple will win the AI race
I'm an expert on App Intents, the API that powers the yet-to-be-seen features of Apple Intelligence – Actions and Personal Context. After designing implementations with my clients, watching the trends around AI-assisted coding, hearing rumors of an iOS 19 redesign, and seeing the acceleration effects of artificial intelligence, I believe Apple is skating to where the puck will be, rather than where it is now.
I'll leave the thesis for the talk – but if you're building for any Apple devices, you'll want to understand how important App Intents is to the future of the platform:
Watch the 54-minute talk from Deep Dish Swift on YouTube Live.
If you’re interested in “going” to WWDC, but don’t have a developer ticket – you should sign up for CommunityKit, the alternative conference1 for Apple developers, media, and fans.
From June 9 through 11, join us at the Hyatt House Cupertino to gather with your fellow participants, learn so many new things, and build some great memories.
Each day, we’ll be joined by our wonderful communities, such as Shortcuts and iOSDevHappyHour, to name a few. We'll also be hosting a live recording of Swift over Coffee, focusing on everything new at WWDC.
Yes, you read that right – I’ll be hosting a Shortcuts/Apple Intelligence/App Intents meetup during one of the afternoons that week! Schedules will be announced later, and I’ll update this post plus create another when I know my official time slot.

Located just a few minutes away from Main Street Cupertino and the Visitor Center at Apple Park, this free conference is designed specifically to make it easy to know where to go if you’re in town for WWDC, merging past events like the watch party from the iOS Dev Happy Hour into one event.
You can watch the WWDC Keynote and State of the Union with developer friends on Monday, plus attend live podcast recordings, join community meetups like mine, and access a hackathon space to work on new ideas all day Tuesday & Wednesday.

To be clear: this means most social events are moving from San Jose to Cupertino this year, so folks don’t have to make their way back and forth across those 8 miles as much. This also means anyone coming from out of town or from San Francisco can stay/park at the Hyatt House each day and easily access most WWDC social events.
If you’re unsure if it’s worth coming to WWDC, let this post convince you – it’ll be a blast and you’ll have something fun to do on Monday, Tuesday, and Wednesday that week.
WWDC is back!2 Get your free ticket to CommunityKit now.
- Not to be confused with the now-defunct AltConf. ↩
- Yes, the official conference has been back for years. But I kept hearing people at Deep Dish Swift ask if the social WWDC is “back”.
Yes, it is! The social scene has been growing for a few years, but it took a while to come together.
Now, more of us are coordinating together to make it like the old days where, if you didn’t have a ticket, you could go to AltConf. Now, you can go to CommunityKit! ↩
I'm super excited to be giving my talk on Apple Intelligence live tomorrow at Deep Dish Swift – if you're interested in tuning in to the conference stream, follow Deep Dish Swift on YouTube:
Check out Deep Dish Swift live and learn more about the conference.
In the process of switching my mounted overhead video setup from a backdrop bar to the Elgato Multi-Mount, I had to make one significant shift – filming upside-down, since the camera is now attached to the back of the desk instead of mounted above from the front. Unfortunately, that means all of my footage needs to be rotated before being usable in editing programs.
In Final Cut Pro for Mac, you can easily rotate clips once you’ve added them to the timeline. However, I’m not actively building a story yet, and I’m instead using the Browser to organize my footage into individual clips using in/out points and Favorites. For a long recording like an unboxing, I can turn an hour of footage into a full set of individual moments as clips, all timed exactly for the action, renamed as needed, and split apart as separate entities in the Browser.
This process means, by default, all my Browser clips are also upside-down – at first glance, this seemed like a big problem for my editing style, since timeline editing is very different from clipping in the Browser, and I might be out of luck.
However, thanks to “2old2care” on Reddit (great username), the solution lies in the “Open Clip” menu option, which I’ve never used before:
Yes, you can invert the clip in the browser. Select the clip, then under "Clip" menu select "Open Clip". You can then go to transform and rotate the clip 180º. I don't know of a way to create a batch in FCP to do this, although it can be done for the original clips using Compressor.
To save myself the trouble of remembering later, I took screenshots of the process – here’s my setup in Organize mode (under Window > Workspaces > Organize):
How to rotate clips within the Browser using Final Cut Pro
- Select the clip you want to rotate – use the filmstrip to identify which files were filmed upside-down.
- In the Menu Bar, navigate to Clip > Open Clip, which has no keyboard shortcut. Optionally, assign a keyboard shortcut under Final Cut Pro > Command Sets > Customize (or use ⌥ + ⌘ + K / Option + Command + K to customize immediately).
- In the Final Cut Pro window, the selected clip will open in its own timeline view. In the Inspector, select Transform and change the Rotation from 0° to 180°.
- In the center of the window, find the clip name and click the dropdown arrow next to it to reveal a context menu – close the clip to return to the full Browser view. The filmstrip will show the flipped clip as you scroll; however, it will continue to show the original upside-down version in the static filmstrip until you leave the project and navigate back or refresh the window.
- Repeat for each upside-down clip.
As 2old2care mentioned, batch-processing files like this would be a more ideal solution – I’ll update this post if I find one.
Check out the source on Reddit, get the Multi-Mount from Elgato, and get Final Cut Pro for Mac from Apple.
I've just added a new folder to the Shortcuts Library — my set of Perplexity shortcuts for asking Perplexity to do research for you.
Use these to open the sections of the website, ask questions in new threads on iPhone and iPad, interact with the Mac app using keyboard shortcuts, go deeper on the Perplexity experience, and interact with the API:
Website
- Open Perplexity AI: Opens the website for Perplexity AI in your default browser.
- Open Perplexity Discover: Opens the Discover page from Perplexity, which curates top stories for you and summarizes them.
- Open my Spaces in Perplexity: Opens the Spaces section of Perplexity, where you can create research and collaboration hubs built on top of Perplexity search.
- Open Perplexity Library: Opens the Library section of Perplexity, where you can see Threads and Pages around searches you’ve performed.
iOS and iPadOS app
- New Search in Perplexity: Opens Perplexity to a new, blank search using the Auto mode.
- New Pro Search in Perplexity: Opens Perplexity to a new, blank search set to Pro mode, which acts as your conversational search guide. “Instead of quick, generic results, Pro Search engages with you, fine-tuning its answers based on your needs.”
- Ask Perplexity: Prompts you to “Ask anything” before opening into Perplexity to search for your query.
- Play Perplexity Discover: Immediately starts a Live Activity session for Perplexity Discover, using stories drawn from the Discover feed and spoken by ElevenLabs’ voices.
- Summarize articles with Perplexity: Creates a series of Threads in Perplexity for URLs shared as input, either from the Share Sheet or by detecting what’s on screen. Includes logic for multiple links, opening each URL in the background until the final query.
Mac app
- Set up Perplexity for Mac: Opens the Mac app for Perplexity AI, resizing the window to 1024×770 and moving it to the center of the current display.
- New Thread in Perplexity: Simulates the keyboard shortcut for Command + Shift + P, which activates the Perplexity search bar from anywhere.
- Voice Mode in Perplexity: Simulates the keyboard shortcut for Command + Shift + M, which activates the Perplexity voice mode in a popover window.
- Upload File in Perplexity: Simulates the keyboard shortcut for Command + Shift + U, which activates the Perplexity upload process and shows a Finder window where you can select the file to upload.
- Voice Dictation in Perplexity: Simulates the keyboard shortcut for Command + Shift + D, which activates the Perplexity voice dictation in the search bar so you can enter a query hands-free.
- Screen Capture with Perplexity: Simulates the keyboard shortcut for Command + Shift + 0, which activates the Perplexity screen capture and prompts whether to capture an Area, Window, or Fullscreen.
Deep Dive
- Open my Perplexity account settings: Opens the Perplexity website to Settings > Account, where you can change general settings like the Appearance, as well as subscription details or system settings.
- Edit my Perplexity profile: Opens the Perplexity settings to the Profile section, where you can tell Perplexity information about yourself and your preferences to inform results.
- Read the Perplexity blog: Opens the Perplexity blog, where you can see stories and announcements from the team on new updates or changes to the service.
- Open the Perplexity discord: Opens the deep link into Discord for the Perplexity channel using the unique ID and channel ID.
- Show the Perplexity Supply store: Opens the website for Perplexity Supply, the clothing line for fans of Perplexity.
API
- Open the Perplexity API docs: Opens the URL for the Perplexity API documentation, so you can quickly reference how to get started or learn about the API.
- Open the Perplexity API reference: Opens the Perplexity API website to the API reference, starting with Chat Completions, where you can test your commands against the API and see what’s working.
- Get my Perplexity API key: Stores your Perplexity API key as base64-encoded text so it isn’t readable as plain text; the key is decoded each time the shortcut runs.
- Manage my Perplexity API keys: Opens the Perplexity website to your API settings, where you can manage API keys and payment details.
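The encode/decode step behind the API key shortcut is easy to reproduce in code – here’s a minimal Swift sketch of the same idea, with a made-up placeholder key (not a real credential):

```swift
import Foundation

// Hypothetical placeholder key – not a real credential.
let key = "pplx-hypothetical-key"

// Encode once, then paste the encoded string into the shortcut's
// Text action so the key isn't stored as readable plain text.
let encoded = Data(key.utf8).base64EncodedString()

// The shortcut decodes it back at run time before calling the API.
let decodedData = Data(base64Encoded: encoded)!
let decoded = String(decoding: decodedData, as: UTF8.self)
```

Note that base64 is obfuscation, not encryption – anyone who opens the shortcut and runs the decode step can recover the key, so this only guards against casual shoulder-surfing.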
Check out the folder of Perplexity shortcuts on the Shortcuts Library.
The first look at Personal Context for Apple Intelligence is here, as APIs available in the iOS 18.4 developer betas allow apps to surface their content for the system to understand. This sets the stage for the most significant update to Siri so far, where all your apps can provide Siri with the available views and content to work with – in a secure and private manner, too.
As first mentioned by Prathamesh Kowarkar on Mastodon, there is now a suite of APIs in beta that associate an app’s unique content, called an entity, with a specific view – this allows Siri to read what’s indexed on-screen and use it with other apps’ actions when triggered by a command.
APIs like this are necessary for the coming Siri update to actually do what Apple says Apple Intelligence is capable of – now that the functionality is here, however, it’s up to developers to implement everything to make sure the experience works well.
Here are the new pages:
- appIntentsDataSource
- NSCollectionViewAppIntentsDataSource
- UICollectionViewAppIntentsDataSource
- appEntityIdentifier(forSelectionType:identifier:)
- appEntityIdentifier(_:)
If these APIs are in beta now, it stands to reason they’ll leave beta after iOS 18.4 releases in full – which means Personal Context might be coming as early as iOS 18.4.
Check out the post from Kowarkar on Mastodon.