In iOS 26, Apple is adding a series of exciting new actions to Shortcuts, with a heavy focus on Apple Intelligence, including direct access to its Foundation Models with the new Use Model action.
Alongside that, Apple has added actions for Writing Tools, Image Playground, and Visual Intelligence, plus the ability to Add Files to Freeform and Notes, Export in Background from the iWork apps, and new Find Conversations & Find Messages actions for the Messages app, among others.
Plus, new updates to current actions—like turning Show Result into Show Content—make existing functionality easier to understand.
Here’s everything that’s new – available now in Public Beta:
The major focus of actions in iOS 26 is access to Apple Intelligence, both directly from the Foundation Models and indirectly through pre-built Writing Tools actions and Image Playground actions – plus a simple “Open to Visual Intelligence” action that seems perfectly suited for the Action button.
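For developers, the Use Model action appears to sit on top of the same Foundation Models framework Apple announced for third-party apps. Here’s a minimal sketch of what calling the on-device model looks like in Swift, based on the WWDC25 API surface – treat the exact names as subject to change while iOS 26 is in beta:

```swift
import Foundation
import FoundationModels

// Minimal sketch: ask the on-device foundation model to summarize text.
// API names reflect the WWDC25 betas and may change before release.
func summarize(_ text: String) async throws -> String {
    // Confirm the model is available on this device (Apple Intelligence
    // enabled, model downloaded) before creating a session.
    guard case .available = SystemLanguageModel.default.availability else {
        throw NSError(domain: "ModelUnavailable", code: 1)
    }

    let session = LanguageModelSession(
        instructions: "Summarize the provided text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

The Use Model action wraps roughly this flow in a single Shortcuts step, with the prompt and output handling exposed as action parameters instead of code.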
Apple has added new actions for system apps and features, starting with an interesting Search action that pulls in a set number of results, similar to Spotlight.
Both Freeform and Notes got “Add File” actions, plus you can add directly to checklists in Notes now too. Apple put the Background Tasks to work with exporting from iWork apps, and nice-to-have actions for Sports, Photos, and Weather make it easier to take advantage of those apps.
Particularly nice are the new Find Conversations and Find Messages actions, the former of which works well with Open Conversation, and the latter of which is a powerful search tool.
Apple continues to make Shortcuts actions easier to understand and adopt for new users, making small tweaks like clarifying Show Content and Repeat with Each Item.
Plus, existing actions like Calculate Expression, Translate, and Transcribe have benefitted from system-level improvements:
These new actions are available now in Public Beta—install at your own risk—and will be fully available in the fall once iOS 26 releases.
There are also further improvements on the Mac, which gained Automations in Shortcuts—including unique File, Folder, and Drive automations only available on Mac—plus the ability to run actions directly in Spotlight. I’ll cover these in future stories – be sure to check the features out if you’re on the betas.
I will update this post if any more actions are discovered in future betas, or if there’s anything I’ve missed here.
P.S. See Apple’s video “Develop for Shortcuts and Spotlight with App Intents” for the example shortcut in the header photo.
Hello friends! It’s my pleasure to announce my second-annual WWDC meetup, this time as part of the free CommunityKit conference under the name “Apple Intelligence Automators” – sign up here for the free event on Tuesday, June 10 from 2:00 PM – 4:00 PM.
Located inside the Hyatt House San Jose / Cupertino at 10380 Perimeter Rd in Cupertino (just a few minutes from Apple Park), we’ll be discussing the announcements from the WWDC keynote address and State of the Union from the day prior as they relate to Apple Intelligence, App Intents, and Shortcuts.
With Apple Intelligence being the focus of last year’s WWDC, and delays on those features pushing things back, we should have plenty to talk about.
Check out the event page on Luma to register and don’t forget to get your free ticket to CommunityKit.
If you’re getting into development for Apple Intelligence, it can be hard to know how to parse Apple’s documentation. App Intents, the API that powers the Actions and Personal Context features of Apple Intelligence, has been shipping since 2022, with a deeper history going back to the introduction of Shortcuts in 2018 – there are over 30 sessions to learn from.
Since I’ve been consulting with developers on their App Intents integrations, I’ve developed a Star Wars Machete Order-style guide for the Apple Intelligence developer sessions – watch the sessions in this order to best understand how Apple thinks about these APIs.
Start with the latest sessions from 2024, which reintroduce App Intents as it extends across the system in more ways, and update the Design guidance from the earlier framework:
From there, once you have the context of how App Intents can be deployed, start back at the beginning to see how to implement App Intents, as well as App Shortcuts – built for Siri in the first year and then updated for Spotlight the next.
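If you want a concrete picture of what those implementation sessions teach before watching, here’s a minimal App Intents sketch – a hypothetical StartTimerIntent plus an App Shortcut that exposes it to Siri and Spotlight with zero user setup:

```swift
import AppIntents

// A minimal intent: one action your app exposes to Shortcuts, Siri, and Spotlight.
struct StartTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Timer"

    @Parameter(title: "Minutes")
    var minutes: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Your app's real timer logic would run here.
        return .result(dialog: "Started a \(minutes)-minute timer.")
    }
}

// App Shortcuts make the intent available by voice and in Spotlight
// without the user building anything in the Shortcuts app first.
struct TimerShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartTimerIntent(),
            phrases: ["Start a timer in \(.applicationName)"],
            shortTitle: "Start Timer",
            systemImageName: "timer"
        )
    }
}
```

The sessions walk through each of these pieces – intents, parameters, dialog, and App Shortcuts phrases – in far more depth.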
Finally, once you understand the core of App Intents, what it used to be vs. what Apple wants you to do now, and how to deploy App Intents across Spotlight and Siri, move on to the latest updates for Apple Intelligence – new features that enable Personal Context, as well as integrating your intents into domains for Siri:
Beyond that, it can be helpful to review earlier sessions to understand where Apple is coming from, as well as to learn about the lesser-known experiences your app is capable of providing:
For good measure, here’s the full list of the Shortcuts / App Intents / Apple Intelligence developer sessions – I’ll update this list after WWDC’25 with new sessions (and create a new post):
Check out more Machine Learning and AI videos from the Apple Developer site, read the full App Intents documentation, and learn more about Apple Intelligence.
P.S. Let me know if you want help with your App Intents integration.
Earlier this evening, I was working on my soon-to-be-relaunched newsletter What’s New in Apple Intelligence, and I opened Reeder to find an article from Daring Fireball titled “15 Years Later: ‘Very Insight and Not Negative’.”
In the post, Gruber recalled Steve Jobs’ comment about his blog, which he inexplicably hadn’t linked to (until now) – upon reading that, I realized I’d had my own Tim Cook moment in a similar realm.
Six years ago, on February 28, 2019, I published a story for iMore where I collected over 100 apps that work with Siri, tweeting:
If you’ve been wanting to get into Siri Shortcuts but don’t know where to start…
Here are 100 apps with shortcuts you can add to Siri and use with your own custom phrase: https://x.com/iMore/status/1101285345390444545
The next day, at about 3pm, Tim Cook quoted my post, saying:
“Worth a look. Something for everyone.”
In the moment, I sent a simple “Thanks Tim!” back, but never posted about it on my blog. So, here we are – while you’re here, see what else I have that’s worth a look – there’s something for everyone.
Check out Tim Cook’s tweet and view the story on Daring Fireball.
Today is Global Accessibility Awareness Day (GAAD), which Apple highlighted in their press release showcasing accessibility features coming in the next year – plus a new Accessibility shortcut called Hold That Thought:
New features include Accessibility Nutrition Labels on the App Store, Magnifier for Mac, Braille Access, and Accessibility Reader; plus innovative updates to Live Listen, visionOS, Personal Voice, and more.
Near the end of the release, Apple explains the new shortcut, plus the addition of the existing Accessibility Assistant shortcut to Vision Pro:
The Shortcuts app adds Hold That Thought, a shortcut that prompts users to capture and recall information in a note so interruptions don’t derail their flow. The Accessibility Assistant shortcut has been added to Shortcuts on Apple Vision Pro to help recommend accessibility features based on user preferences.
Here’s how Apple describes the shortcut:
Interruptions can cause you to forget tasks and affect productivity, especially for neurodivergent individuals.
When you run this shortcut, you have two options: Capture and Recall.
Run the shortcut and select Capture to capture a screenshot of what you’re doing, any calendar events in the next hour, current open webpage in Safari (Mac only), and Clipboard contents. You’ll then be prompted to write short notes about what you are doing and what you are about to do. Run the shortcut again and select Recall to find the last created note with all the captured information. All notes will be saved with the title “Hold that thought” and the date and time saved.
Run this shortcut using Siri, or add it to the Control Center, Action button or to the Home Screen for quick access.
I love this idea, and the core concept matches the inspiration for my currently-secret app idea that I teased at the end of my Deep Dish Swift talk.
I do have a few suggestions for improvements to the shortcut, however:
To improve the shortcut as it is today, I’d simply remove the Show Alert and Wait actions, and assign this new shortcut to the Action button – that way you can immediately take a screenshot, then answer the prompts, and move on.
Going further, I’d love to see a new version of this next year once Apple Intelligence ships in full, which utilizes “Get On-Screen Content” and accesses all the data available from apps for Personal Context.
Get “Hold That Thought” for Shortcuts, view the announcement from the Apple Newsroom, and check out past updates from GAAD.
Last Tuesday, I gave a talk to over 300 developers at Deep Dish Swift about Apple Intelligence, where I made the following claim:
Apple will win the AI race
I’m an expert on App Intents, the API that powers the yet-to-be-seen features of Apple Intelligence – Actions and Personal Context. After designing implementations with my clients, watching the trends around AI-assisted coding, hearing rumors of an iOS 19 redesign, and seeing the accelerating effects of artificial intelligence, I believe Apple is skating to where the puck will be, rather than where it is now.
I’ll leave the thesis for the talk – but if you’re building for any Apple devices, you’ll want to understand how important App Intents is to the future of the platform:
Watch the 54-minute talk from Deep Dish Swift on YouTube Live.
If you’re interested in “going” to WWDC, but don’t have a developer ticket – you should sign up for CommunityKit, the alternative conference for Apple developers, media, and fans.
From June 9 through 11, join us at the Hyatt House Cupertino to gather with your fellow participants, learn so many new things, and build some great memories.
Each day, we’ll be joined by our wonderful communities, such as Shortcuts and iOSDevHappyHour, to name a few. We’ll also be hosting a live recording of Swift over Coffee, focusing on everything new at WWDC.
Yes, you read that right – I’ll be hosting a Shortcuts/Apple Intelligence/App Intents meetup during one of the afternoons that week! Schedules will be announced later, and I’ll update this post plus create another when I know my official time slot.
Located just a few minutes away from Main Street Cupertino and the Visitor Center at Apple Park, this free conference is designed specifically to make it easy to know where to go if you’re in town for WWDC, merging past events like the watch party from the iOS Dev Happy Hour into one event.
You can watch the WWDC Keynote and State of the Union with developer friends on Monday, plus attend live podcast recordings, join community meetups like mine, and access a hackathon space to work on new ideas all day Tuesday & Wednesday.
To be clear: this means most social events are moving from San Jose to being more focused in Cupertino this year, so folks don’t have to make their way back-and-forth across those 8 miles as much. This also means anyone coming from out of town or from San Francisco can stay/park at the Hyatt House each day and easily access most WWDC social events.
If you’re unsure whether it’s worth coming to WWDC, let this post convince you – it’ll be a blast and you’ll have something fun to do on Monday, Tuesday, and Wednesday that week.
WWDC is back! Get your free ticket to CommunityKit now.
I’m super excited to be giving my talk on Apple Intelligence live tomorrow at Deep Dish Swift – if you’re interested in tuning in to the conference stream, follow Deep Dish Swift on YouTube:
Check out Deep Dish Swift live and learn more about the conference.
In the process of switching my mounted overhead video setup from a backdrop bar to the Elgato Multi-Mount, I had to make one significant shift – filming upside-down, since the camera is now attached to the back of the desk instead of mounted above from the front. Unfortunately, that means all of my footage needs to be rotated before being usable in editing programs.
In Final Cut Pro for Mac, you can easily rotate clips once you’ve added them to the timeline. However, I’m not actively building a story yet, and I’m instead using the Browser to organize my footage into individual clips using in/out points and Favorites. For a long recording like an unboxing, I can turn an hour of footage into a full set of individual moments as clips, all timed exactly for the action, renamed as needed, and split apart as separate entities in the Browser.
Because of this process, by default all my Browser clips are upside-down too, and at first glance that seemed like a big problem for my editing style – timeline editing is very different from clipping in the Browser, and I might have been out of luck.
However, thanks to “2old2care” on Reddit (great username), the solution lies in the “Open Clip” menu option, which I’ve never used before:
Yes, you can invert the clip in the browser. Select the clip, then under “Clip” menu select “Open Clip”. You can then go to transform and rotate the clip 180º. I don’t know of a way to create a batch in FCP to do this, although it can be done for the original clips using Compressor.
To save myself the trouble of remembering later, I took screenshots of the process – here’s my setup in Organize mode (under Window > Workspaces > Organize):
As 2old2care mentioned, batch-processing files like this would be the ideal solution – I’ll update this post if I find one.
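In the meantime, one possible batch workaround outside Final Cut Pro is tagging each file’s rotation metadata with ffmpeg (assuming it’s installed, e.g. via Homebrew). Because it’s a stream copy there’s no re-encode or quality loss, but rotation-metadata handling varies between players and editors, so test whether your FCP version honors the tag before relying on it. A rough Swift script version of the idea:

```swift
import Foundation

// Sketch: tag every .mov in a folder as rotated 180º using ffmpeg,
// writing "<name>.rotated.mov" next to each original. Paths are
// examples — adjust for your own setup.
let sourceDir = URL(fileURLWithPath: "/path/to/footage")
let files = try FileManager.default
    .contentsOfDirectory(at: sourceDir, includingPropertiesForKeys: nil)
    .filter { $0.pathExtension.lowercased() == "mov" }

for file in files {
    let output = file.deletingPathExtension().appendingPathExtension("rotated.mov")
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/opt/homebrew/bin/ffmpeg")
    process.arguments = [
        "-i", file.path,
        "-metadata:s:v:0", "rotate=180", // flag the video stream as rotated
        "-c", "copy",                    // stream copy: no re-encode
        output.path
    ]
    try process.run()
    process.waitUntilExit()
}
```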
Check out the source on Reddit, get the Multi-Mount from Elgato, and get Final Cut Pro for Mac from Apple.
I’ve just added a new folder to the Shortcuts Library — my set of Perplexity shortcuts for asking Perplexity to do research for you.
Use these to open the sections of the website, ask questions in new threads on iPhone and iPad, interact with the Mac app using keyboard shortcuts, go deeper on the Perplexity experience, and interact with the API:
Check out the folder of Perplexity shortcuts on the Shortcuts Library.
The first look at Personal Context for Apple Intelligence is here, as APIs available in the iOS 18.4 developer betas allow apps to surface their content for the system to understand. This sets the stage for the most significant update to Siri so far, where all your apps can provide Siri with the available views and content to work with – in a secure and private manner, too.
As first mentioned by Prathamesh Kowarkar on Mastodon, there is now a suite of APIs in beta that associate an app’s unique content, called an entity, with a specific view – this allows Siri to read what’s indexed on-screen and use it with other apps’ actions when triggered by a command.
APIs like this are necessary for the coming Siri update to actually do what Apple says Apple Intelligence is capable of – now that the functionality is here, however, it’s up to developers to implement everything to make sure the experience works well.
Here are the new pages:
If these APIs are in beta now, it stands to reason they’ll exit beta when iOS 18.4 ships in full – which means Personal Context might arrive as early as iOS 18.4.
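Based on Kowarkar’s findings, the shape of the integration looks roughly like the sketch below – a hypothetical NoteEntity attached to the view that displays it. The .appEntity(_:) modifier name comes from reports of the beta documentation, so treat the whole thing as an assumption until the APIs stabilize:

```swift
import SwiftUI
import AppIntents

// Hypothetical entity representing a note in your app, for illustration.
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Lets the system resolve entity identifiers back into real notes.
struct NoteQuery: EntityQuery {
    func entities(for identifiers: [NoteEntity.ID]) async throws -> [NoteEntity] {
        // Look up notes from your store; stubbed here.
        []
    }
}

struct NoteDetailView: View {
    let note: NoteEntity

    var body: some View {
        Text(note.title)
            // Associates this view with the entity so the system
            // (and eventually Siri) knows what's on screen.
            .appEntity(note)
    }
}
```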
Check out the post from Kowarkar on Mastodon.
From the Apple Newsroom:
Apple today announced the new Mac Studio, the most powerful Mac ever made, featuring M4 Max and the new M3 Ultra chip. The ultimate pro desktop delivers groundbreaking pro performance, extensive connectivity now with Thunderbolt 5, and new capabilities in its compact and quiet design that can live right on a desk. Mac Studio can tackle the most intense workloads with its powerful CPU, Apple’s advanced graphics architecture, higher unified memory capacity, ultrafast SSD storage, and a faster and more efficient Neural Engine.
My M1 Mac mini from 2020 is also way overdue for an upgrade…
From the Apple Newsroom:
Apple today announced the new MacBook Air, featuring the blazing-fast performance of the M4 chip, up to 18 hours of battery life, a new 12MP Center Stage camera, and a lower starting price. It also offers support for up to two external displays in addition to the built-in display, 16GB of starting unified memory, and the incredible capabilities of macOS Sequoia with Apple Intelligence — all packed into its strikingly thin and light design that’s built to last.
I’ve been rocking the M1 MacBook Air from 2020, but it’s beyond time I upgraded…
New in iOS 18.4, Apple is making a new Food section available to Apple News+ subscribers, creating a curated browsing and recipe experience within the app. Located on iPhone under the Following tab and Food section, or in the Food section of the sidebar on iPad and macOS, this new category curates stories for you based on your chosen interests and browsing history, plus provides an entire Recipe Catalog and cooking experience for recipes with ingredients & instructions.
The entire experience for News+ Food is fantastic, albeit somewhat buried inside the News app – that’s why I’ve built a set of shortcuts to quickly access the sections from anywhere. In my folder of Apple News Food shortcuts, you can find shortcuts to access the main Food section, the Recipe Catalog, and two curated sections that are shown within the category for Healthy Eating and Kitchen Tools & Techniques.
You can use these with Siri, place them in a Medium widget, or even add them as Controls in Control Center or the Lock Screen – the Recipe Catalog would work great using Add to Home Screen as well, as Stephen Robles demonstrated in his video that highlights the Food feature.
So far, the News team at Apple has only ever created the Show Today Feed and Show Topic actions, relying on the concept of “donations” (where an action only becomes available after the user interacts with a particular section) for sections like Magazines, Puzzles, and now the Recipe Catalog. Along this route, I’d love to see the Saved Recipes section available as a donated action, and being able to open directly to a saved recipe would make a lot of sense too. But, going further, I wish the News team would adopt a full suite of actions like Get Recipes, Find Recipe, Save/Unsave Recipe, Cook Recipe, and Read The Story (for a recipe).
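For context on how donations work under the hood: in the App Intents framework, an app donates an intent after the user does the corresponding thing in its UI, which is what makes the action start surfacing in Shortcuts and Siri suggestions. Here’s a rough sketch with a hypothetical intent – Apple News’ actual implementation isn’t public:

```swift
import AppIntents

// Hypothetical intent for opening the Recipe Catalog, for illustration
// only — Apple News does not expose this as a buildable API.
struct OpenRecipeCatalogIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Recipe Catalog"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // Navigate to the catalog in the app here.
        return .result()
    }
}

// When the user visits the section, the app donates the intent so the
// corresponding action starts appearing in Shortcuts and suggestions.
func userDidOpenRecipeCatalog() {
    Task {
        try? await IntentDonationManager.shared.donate(intent: OpenRecipeCatalogIntent())
    }
}
```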
Get the folder of Apple News Food shortcuts in my Shortcuts Library (requires iOS 18.4).
I’ve just added a new folder to the Shortcuts Library — my set of Apple News Food shortcuts for the new Food section in Apple News+, available in iOS 18.4.
Use these shortcuts to browse stories from the Food, Healthy Eating, and Kitchen Tools & Techniques sections, as well as open directly to the Recipe Catalog:
Check out the folder of Apple News Food shortcuts on the Shortcuts Library.
From Stephen Robles on YouTube:
First beta of iOS 18.4 is here and it brings some exciting new features including ambient music controls, Apple News recipes, and a lot more. Plus, we dive into a feature that might hint at Apple’s upcoming smart home device.
I’ve linked directly to the chapter on the new Shortcuts actions.
Framous is a new design tool for Mac from developer and podcaster Charlie Chapman aimed at making it easy to wrap iPhone, iPad, Apple Watch, and Mac screenshots in device frames, turning a boring rectangular screenshot into a rich preview of what that screen would actually look like on a real device.
Framous is built to auto-detect the device, let you automatically combine several images with custom spacing, or bulk-export multiple images at once – the last of which is clearly helpful for app developers designing images for the App Store. But using device frames isn’t just for developers – anyone sharing their screenshots can clean them up and make a much nicer presentation by processing their screenshots through Framous.
Thankfully, Framous makes the process of framing your screenshots even easier thanks to Shortcuts support, with an action that lets you pass screenshots in, choose spacing, and get device-wrapped images out as a result. Over on Bluesky, Charlie shared an example shortcut that takes advantage of Shortcuts for Mac’s Quick Action functionality, which lets you select files directly in Finder, run the shortcut, and get the updated assets placed alongside the originals:
Whoa! Now with this basic Shortcut setup I can select a couple screenshots in Finder, hit ctrl+opt+cmd+F, and it’ll prompt for a save location and create framed screenshots for each and combine into a single image (with custom padding)
— Charlie Chapman (@charliemchapman.com) February 17, 2025 at 7:45 PM
Merging screenshots with devices has long been a part of Shortcuts’ history, with Federico Viticci of MacStories releasing a regularly-updated Apple Frames shortcut (for free) that performs a similar operation using Shortcuts’ Scripting actions. I’ve almost always taken Federico’s shortcut and modified it for my own needs, adding a Combine Images step and separating out the spacing according to my style – exactly what Framous adds on top of this well-proven use case.
While Framous is Mac-only, it provides a complete UI with features like drag-and-drop that make the task much more approachable for an everyday Mac user. Plus, having a native action built into Shortcuts gives Framous the ability to expose things like spacing or how to handle multiple devices as simple parameters on the action itself, rather than building menus or prompts into a custom shortcut. That being said, I’d also love to see Framous continue to strengthen its Shortcuts support with features like controls for the generic frames.
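To illustrate that last point, here’s a rough sketch of what a parameterized framing action can look like in the App Intents framework – hypothetical names throughout, not Framous’s actual code:

```swift
import AppIntents

// Hypothetical "frame screenshots" action: files in, spacing as a
// parameter, framed files out — the pattern a native action enables.
struct FrameScreenshotsIntent: AppIntent {
    static var title: LocalizedStringResource = "Frame Screenshots"

    @Parameter(title: "Screenshots")
    var screenshots: [IntentFile]

    @Parameter(title: "Spacing", default: 40)
    var spacing: Int

    func perform() async throws -> some IntentResult & ReturnsValue<[IntentFile]> {
        // Wrap each screenshot in a device frame (stubbed here);
        // a real app would composite the images before returning.
        let framed = screenshots
        return .result(value: framed)
    }
}
```

Because spacing lives on the action itself, a shortcut using it needs no Ask for Input steps – the choice is made once, right in the editor.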
Other features of Framous include customizing the generic device frames to show or hide side buttons, move their placement, or control the camera cutout. You can also toggle whether the screenshot fills edge-to-edge, change the frame color, and adjust the corner radius, plus scale up the image for lower-resolution screenshots.
Get Framous on the Mac App Store for free, with in-app purchases. The free download comes with generic frames – to unlock more, a one-time fee of $19.99 gets you all frames released in (and up to) 2025. If you want all frames as they come out in the future, you can subscribe for $9.99 per year.
I’ve added two more products to my desk setup today thanks to my Elgato partnership – a pair of CamLink 4Ks, the HDMI-to-USB converter for dedicated cameras.
A few years ago, I bought a second Panasonic GH5 to go with my first camera, allowing me to have both a dedicated A-Roll and top-down camera set up at all times. For the longest time since then, I’ve relied on the USB-C port and the LUMIX Tether app to bring in the feeds for both cameras – however, it required launching the tether app, doing a special combination of previewing and minimizing the app to get the full-quality feed, and then using that footage for my streams.
In the end, the setup process was too finicky, the frame rate wasn’t ideal once I started loading more devices onto the USB chain, and, at some point, macOS decided to stop recognizing the cameras as two separate devices and thought they were the same camera, leading to more errors and making it impossible to actually pull both feeds in at high quality.
So I started back where I began, with a CamLink 4K – the first Elgato product I bought back in the day, for this exact purpose (at some point I accidentally bent the port…).
Now, with one for each camera, I can speak to the camera while showing what I’m doing on my iPad, iPhone, or other devices at the same time. When I had just one camera, I recorded YouTube videos by speaking to the camera, then recording the top-down footage timed to what I’d said; when I had two, I could record it all, but not preview it in real time while still using my Mac – now I can do everything at once.
My thanks to Elgato for sending me the CamLink 4K set – you can check out more on the Elgato website or my Elgato tag.