AI progress isn’t only about advancing intelligence—it’s about unlocking it through interfaces that understand context, adapt to your intent, and work seamlessly. That’s why we’re excited to share that OpenAI has acquired Software Applications Incorporated, makers of Sky.
And:
“We’ve always wanted computers to be more empowering, customizable, and intuitive. With LLMs, we can finally put the pieces together. That’s why we built Sky, an AI experience that floats over your desktop to help you think and create. We’re thrilled to join OpenAI to bring that vision to hundreds of millions of people.” —Ari Weinstein, Co-Founder and CEO, Software Applications Incorporated
Incredible run by my former teammates – first selling Workflow to Apple, and now Sky to OpenAI.
I’m super excited to see their deep talent and passion reflected in the ChatGPT app.
Accidentally wrote the perfect tweet – a technically-true point that’s open to wild interpretation, which went viral on Twitter (and which, of course, Grok got slightly wrong):
Former Apple engineer1 Matthew Cassinelli disclosed that the company conducts much of its internal operations using its own iWork apps, including Calendar, Contacts, Pages, Numbers, and Keynote, as a key aspect of its dogfooding practice. This revelation, shared on X and garnering over 15,000 engagements, highlights Apple’s commitment to testing products internally to drive improvements, with proprietary backend tools enhancing functionality for its needs. While users praised the apps’ integration and usability in areas like Keynote and Pages, criticisms focused on Contacts’ cumbersome interface and the occasional reliance on tools like Excel for complex tasks.
A few clarifications (you can find in more detail in the replies):
I both used and enjoyed these apps before I joined, but I had never seen a whole company committed to them.
Apple does not necessarily “conduct much of its internal operations” within iWork – they all have the apps and use them, but there are many other tools in place.
Employees are given these apps and use them by default, but do not exclusively use these tools; the people in Finance use Excel, for example, and anything specialized like CAD is also used – in addition to iWork.
Contacts and Calendar specifically are buoyed by an internal directory; Mail also has server-side rules that make filtering easy for them, which aren’t available in the shipping product.
I worked at Apple in 2017, so this is outdated – still true at a base level, but they’ve adopted more advanced tools like Slack since then.
On a basic level, Apple provides the apps because they make the apps, and it wouldn’t make sense to pay for a second set of tools for every employee while also not using your own freely-available product.
Plus, while you’re here – if you’re ever running into speed problems with apps like Contacts or Calendar, you should look into Shortcuts. For example, the new Use Model action for Apple Intelligence makes tasks like processing contact information much easier to build within a few steps.
For what it’s worth, I was a Product Specialist and not an engineer – I studied Business Administration and Marketing before joining Workflow. I became a programmer because of Workflow (now Shortcuts), but I don’t want to misrepresent myself as a former Apple engineer. ↩
Feel free to repost your own wildly-misinterpreted version of my point so I can hit Creator Monetization and get paid. ↩
On iPhone 17 models, Apple has added hardware and software support for advanced Camera features like Dual Capture and Center Stage, which allow capturing footage in more dynamic ways than ever.
Quickly accessing new features like this and forming muscle memory is critical to user adoption & long-term habits, which is why Apple should expand the Camera app’s Shortcuts support to everything new – something I’ve requested directly via the Feedback app in issue FB20772988 (Add Dual Capture and Selfie Video to Camera actions in Shortcuts).
Dual Capture and Selfie Rotate on iPhone 17
With any iPhone 17, you’re now able to capture both front-facing and rear-facing footage at the same time in a Dual Capture experience. This is an awesome merging of hardware and software that creates a personal capturing experience I’ve loved since the Frontback days – a memory that says “here’s where I am”, but also “here’s who I am” (and “here’s who I’m with” too).
Plus, the selfie sensor has been expanded to a square size to allow both portrait and landscape capture, enabling features like a Selfie Rotate button to shoot in landscape while holding the phone vertically,1 as well as Center Stage functionality that automatically expands the shot depending on how many people are paying attention in-frame.
On The Stalman Podcast, Apple iPhone Product Manager Megan Nash specifically mentioned that holding the phone vertically created better eye gaze, which is otherwise awkward and often prevents people like me from filming themselves:
“You’ll notice people in the photos have better eye gaze because the camera preview is centered with the front camera, rather than being off to the side when you rotate iPhone to horizontal.”
These are incredible additions to the lineup and the primary reason I was excited to upgrade this year – both will make everyday content creation easier and more dynamic.
Expand Camera’s App Shortcuts Support
I’m proposing that Apple add these features into the Camera app’s Shortcuts support, either in the form of expanded App Shortcuts or an overhaul to the Camera actions.
Currently, in Shortcuts, the Camera app has a single action, Open Camera, that opens the camera in a specified mode. As of writing, you’re able to choose from Selfie, Video, Portrait, Portrait Selfie, Photo, Cinematic, Slo-Mo, Time-Lapse, Pano, Spatial Video, and Spatial Photo.
Crude rendering by yours truly.
The simplest update would be to include options for Dual Capture and Landscape Selfies, allowing a quick addition to existing functionality. This would build upon the curated App Shortcuts experience and make these new features immediately available via Siri, on the Lock Screen, in Control Center, and on the Action button – the simplest and most likely outcome.
Overhaul Camera’s App Intents Support
However, I propose Apple give the Camera app a deeper App Intents review and consider splitting up the Open Camera action in alignment with the Camera app redesign, building out the longstanding Take Video and Take Photo actions from Workflow and including additional functionality as parameters.
Take Video could include modes (and App Shortcuts) for Video, Cinematic, Slo-Mo, and Time-Lapse, each with dependent parameters for front-/rear-facing cameras, zoom levels and rotate options, extra features, and video formats. Take Photo could include modes (and App Shortcuts) for Photo, Selfie, Portrait, Spatial, and Pano, with the same additional functionality as parameters for each mode.2
Adding both options as separate actions would add long-desired functionality to the Camera app’s existing actions and enable a wide array of creator-focused shortcuts based on hyper-specific shooting modes. Plus, these actions could still be turned into App Shortcuts, enabling everyday users to quickly access Dual Capture or landscape-in-portrait selfies on their new iPhone 17 as needed.
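For illustration, here’s a rough sketch of how a mode-aware Take Video intent could be structured – purely hypothetical on my part, using the real App Intents framework but with illustrative type and parameter names rather than anything Apple has shipped:

```swift
import AppIntents

// Hypothetical shooting modes – illustrative names, not Apple's actual API.
enum VideoCaptureMode: String, AppEnum {
    case video, cinematic, sloMo, timeLapse, dualCapture

    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Video Mode"
    static var caseDisplayRepresentations: [VideoCaptureMode: DisplayRepresentation] = [
        .video: "Video", .cinematic: "Cinematic", .sloMo: "Slo-Mo",
        .timeLapse: "Time-Lapse", .dualCapture: "Dual Capture"
    ]
}

// A sketch of a parameterized Take Video action.
struct TakeVideoIntent: AppIntent {
    static var title: LocalizedStringResource = "Take Video"

    @Parameter(title: "Mode", default: .video)
    var mode: VideoCaptureMode

    @Parameter(title: "Use Front Camera", default: false)
    var useFrontCamera: Bool

    func perform() async throws -> some IntentResult {
        // A real implementation would start capture in the chosen mode here.
        return .result()
    }
}
```

Each mode could then be surfaced as its own App Shortcut – which is exactly what would put Dual Capture one tap away on the Action button.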
Apple – please make it easier to take landscape selfies!3
If you want to see this update, please duplicate my report4 in the Feedback app to signal to Apple that multiple users want this changed.
FYI according to the Alt Text on the Apple Support website, it is officially called “the Selfie Rotate button.” ↩
There may need to be some slight fudging of “modes” to make a pleasant App Shortcuts experience here, otherwise having “normal,” “Selfie,” and “Landscape Selfie” versions of each as additional options might be too much – I can see why they might’ve chosen to avoid this route originally. That being said, they should go further with more actions rather than pulling back. ↩︎
There’s got to be a better way to say “enabling landscape selfies while holding iPhone vertically” (from 3:35) – I propose “landscape selfie” as the generic term. ↩︎
On iPhone 17, new Camera modes like Dual Capture and Selfie Rotate let users record from both cameras or film landscape selfies while holding iPhone vertically. These features aren’t available in Shortcuts or App Shortcuts, making them harder to access quickly. The simplest improvement would be adding Dual Capture and Selfie Video options to the existing Open Camera action. Longer term, Camera could gain full App Intents support by splitting Open Camera into Take Photo and Take Video actions with parameters for mode, camera, and format. Results Expected: I am expecting to find all Camera functionality, including Dual Capture, Selfie Video, and future modes, available in the Shortcuts app or App Shortcuts experiences for use from the Lock Screen, Control Center, or Action button. ↩
“I mean, that’s the bottom line is it’s a great idea. And like I said about Misty Studio1, all things considered, it does a pretty good job of being kind of a friendly face to building an AI model, but in the end, it’s like Shortcuts in that it’s not really that friendly.”
Fair enough – if it truly was, I’d have been out of a job for a long time.
For reference, they explained Misty Studio earlier: “Misty Studio is a demo that Apple did for the M5. Misty Studio runs an open-source model locally.” ↩
P.S. I apologize in advance to Jason for the URL slug 🙂 ↩
With the release of iOS 26, iPadOS 26, and macOS 26 this month, developers around the world are able to bring even more intelligent experiences right into their apps by tapping into the on-device large language model at the core of Apple Intelligence.
The Foundation Models framework allows developers to create new intelligence features that protect users’ privacy and are available offline, all while using AI inference that is free of cost.
You can now add intelligence to your apps for free on Apple platforms – and while it’s relatively simple today… that’s only for now.
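To give a sense of how simple, here’s a minimal sketch of prompting the on-device model, based on my understanding of the Foundation Models API (error handling kept deliberately brief):

```swift
import FoundationModels

// A minimal sketch: create a session with instructions, send a prompt,
// and read back the generated text – all on-device, at no cost.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```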
App Intents are how Apple devices understand and interact with your app. They’re the foundation of features like Shortcuts, Siri, and Spotlight – and now Apple Intelligence. If you want your app to take advantage of the deepest parts of the Apple ecosystem, it starts with App Intents.
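At its simplest, a single intent is all the system needs to surface an action – a minimal sketch, with an illustrative intent name of my own:

```swift
import AppIntents

// One intent is enough for Shortcuts, Siri, and Spotlight to discover
// and run an action from your app. "Log Water" is just an example.
struct LogWaterIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Water"
    static var description = IntentDescription("Logs a glass of water.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Your app's real logging logic would run here.
        return .result(dialog: "Logged a glass of water.")
    }
}
```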
Following recent projects with Foodnoms, MindNode, and Tripsy, I now have availability for App Intents consulting this fall and into 2026. In addition to full start-to-finish projects, I’m introducing new flexible options:
Audits: a focused review of your existing intents, data models, and opportunities
Docs-only: structured documentation you can use with your team to implement directly
If you want Apple Intelligence to understand your app’s core features, or you want to deploy your app across the system to make a cohesive experience, I can help you design and deliver the following:
The unreleased Actions and Context portions of Apple Intelligence
App Intents, App Entities, and App Enums for your app
Automatically-generated instances of important intents as App Shortcuts
Spotlight, Siri, and Controls integrations
Custom Shortcuts to be distributed to users
Documentation on the new offerings
Ongoing updates shared in a developer newsletter
Each engagement starts with a free, 1-hour call to assess your needs, discuss budgets and rates, and outline next steps – whether you’re a brand, part of a team, or an indie developer, we can find a solution that works for you.
You can learn more about my services, explore past client work, and watch my conference talks on my Consulting page. If you’re ready to move forward, book a call with me directly to get started.
Let’s make your app one of the best citizens of the Apple ecosystem – ready for Apple Intelligence, Shortcuts, and beyond.
Use these to ask Apple’s on-device or Private Cloud Compute models, talk to ChatGPT, utilize Writing Tools, generate images with Image Playground, and create Memories in Photos.
Use Model: Allows you to enter a request, asks which model to prompt, then lets you ask Follow Up questions, and shows you the final response.
Pass through Writing Tools: For a given input, asks you to describe your change – then, creates a summary, key points, list, and table, plus proofreads, rewrites, and adjusts the tone. Produces Markdown-ready text, complete with auto-generated title.
Create with Image Playground: Asks you to describe an image or takes an image from input, then to choose an art style, then creates an image and shows it to you (plus saves it to Image Playground).
Create Memory in Photos: Asks you to describe a memory to create, then uses Apple Intelligence and the Photos app to generate a Memory for you.
New in iOS 26, iPadOS 26, macOS 26, watchOS 26, and visionOS 26
This update includes enhancements to the Shortcuts app across all platforms, including new intelligent actions and an improved editing experience. Shortcuts on macOS now supports personal automations that can be triggered based on events such as time of day or when you take actions like saving a file to a folder, as well as new integrations with Control Center and Spotlight.
New Actions (Editor’s note: shortened for sake of space)
Freeform
Image Playground, requires Apple Intelligence*
Mail
Measure
Messages
Screen Time
Sports
Photos
Reminders
Stocks
Use Model, requires Apple Intelligence*
Visual Intelligence, requires Apple Intelligence*
Voice Memos
Weather
Writing Tools, requires Apple Intelligence*
Updated Actions
For those building custom shortcuts, some actions have been updated:
“Calculate Expression” can now evaluate expressions that include units, including real-time currency conversion rates, temperature, distance, and more
“Create QR Code” can now specify colors and styling
“Date” can now specify a holiday
“Find Contacts” can now filter by relationship
“Transcribe Audio” performance has been improved
“Show Content” can now display scrollable lists of items, like calendar events, reminders, and more
Shortcut Editor
For those building custom shortcuts, updates have been made to the shortcut editor:
Improved drag and drop and variable selection
Over 100 new icon glyphs are now available, including new shapes, transportation symbols, and more
Rich previews of calendar events, reminders, and more
The ability to choose whether shortcuts appear in Spotlight Search
macOS Improvements
Spotlight
Shortcuts can now accept input, like selected text from an open document, when being run from Spotlight.
Automations
Shortcuts can now be run automatically based on the following triggers:
Time of Day (“At 8:00 AM, weekdays”)
Alarm (“When my alarm is stopped”)
Email (“When I get an email from Jane”)
Message (“When I get a message from Mom”)
Folder (“When files are added to my Documents folder”)
File (“When my file is modified”)
External Drive (“When my external drive connects”)
Search and Take Action with Updates to Visual Intelligence
Visual intelligence, which builds on Apple Intelligence, now helps users learn and do more with the content on their iPhone screen. It makes it faster than ever for users to search, take action, and answer questions about the content they’re viewing across their apps.
Users can search for the content on their iPhone screen to find similar images across Google, as well as apps that integrate this experience, such as eBay, Poshmark, Etsy, and more. If there’s an object a user is interested in learning about, like a pair of shoes, they can simply press the same buttons used to take a screenshot and highlight it to search for that specific item or similar objects online. And with ChatGPT, users can ask questions about anything they’re viewing onscreen.
Visual Intelligence on iPhone 17 Pro
Updates to visual intelligence help users learn and do more with the content on their iPhone screen.
Visual intelligence enables users to summarize and translate text, as well as add an event from a flyer on their iPhone screen to their calendar, with a single tap.
Users can also take advantage of these capabilities by using visual intelligence with their iPhone camera through Camera Control, the Action button, and in Control Center.
And:
Build Intelligent Shortcuts
Shortcuts help users accomplish more, faster, by combining multiple steps from their favorite apps into powerful, personal automations. And now with Apple Intelligence, users can take advantage of intelligent actions in the Shortcuts app to create automations, like summarizing text with Writing Tools or creating images with Image Playground.
Users can tap into Apple Intelligence models, either on device or with Private Cloud Compute, to generate responses that feed into the rest of their shortcut, maintaining the privacy of information used in the shortcut. For example, users can create powerful Shortcuts like comparing an audio transcription to typed notes, summarizing documents by their contents, extracting information from a PDF and adding key details to a spreadsheet, and more.
In my latest addition to the Shortcuts Library, I updated my shortcuts for the TV app with expanded functionality, including many new functions, redesigned menus, and, critically, support for macOS.
I accomplished Mac support using Shell Scripting, a technique which I’m sharing for members:
Opening deep links into Mac apps
To open a URL in a specific application, the best method within Shortcuts is the Run Shell Script action, which allows you to execute command-line utilities using shell scripts, similar to the Terminal.
In order to open any app using a shell script, the following command can be used – here’s an example for the TV app:
open -a TV
To open a URL within that app, you only need to append the URL afterwards, like so:
open -a TV https://tv.apple.com/watch-now
In my shortcuts, I’ve used the Deep Link variable to pass in the appropriate URL for the TV app into the shell script on-demand – only if run from Mac.
This works great with my TV app shortcuts, which utilizes preset versions of the URLs from the tv.apple.com website (as well as web scraping from the Apple TV Marketing Toolbox).
On desktop, these links will activate and open the TV app for Mac – on mobile, they’ll go straight into the TV app for iPhone and iPad.
Add to Dock and Open URLs
I’ve also used this method to open links from web services like Reddit in their desktop apps – web apps I’ve created for a handful of social media services using the “Add to Dock” feature added to macOS in recent years.
I can also see this technique being useful for native versions of web apps like Notion or Airtable – I’ll have to explore more there soon.
These are redesigned for my new approach to building shortcuts, which moves away from separate single-purpose actions toward a more bundled design – each shortcut provides more functionality in a targeted area.
My favorite is the new Watchlist shortcut – I’ve been working on a version of this for the last year or so! Enjoy:
Open into the TV app: Presents a menu of sections in the TV app and opens the deep link into the app on iPhone, iPad, and Mac – options include Home, Search, Store, Sports, Apple TV+, and Library. When run from Apple Watch, opens the Apple TV app.
Add to my TV watchlist: Accepts a list of TV shows or movies, scrapes the results from Apple’s Marketing Toolbox, and lets you pick where to send the media – with options to open into the TV app, add to your Watchlist, send to Reminders, or copy the links.
Open sports in the TV app: Presents a menu of Sports sections available in the TV app, including overall Sports, plus MLS Season Pass and Major League Baseball, as well as a dedicated section for your favorite home team.
Browse the TV Store: Presents a menu for opening into the TV app to the Store section, either directly using a deep link, using the iTunes actions in Shortcuts, or Apple’s RSS feeds for top movie and TV content – plus categories for dedicated “rooms” in the TV app for special content.
Open from Apple TV Plus: Presents menu options for opening into the Home, Shows, Movies, and Upcoming sections of Apple TV+ in the TV app, plus categories for genres.
My friends at Elgato have updated the Stream Deck app for Mac to version 7.0, with new features for creating virtual Stream Decks on your computer, Key Logic so each key has multi-tap abilities, weather updates in a new plugin, and quality-of-life features like showing whether an app is currently open.
Here’s how they describe the updates:
🎛️ Virtual Stream Deck — your on-screen workspace controller
Create unlimited virtual keys, customize actions and layouts, then pin them in place or summon to your cursor. It’s your OS sidekick, making every workflow fast and effortless. It’s Stream Deck on your computer, anywhere you go.
[…]
👇 Key Logic — multi-tap abilities
Assign up to three different actions to a single key using Key Logic. Perform a unique action based on how the key is pressed:
Press
Double press
Press and hold
For example, press to play/pause music, double press to skip tracks, or press and hold to go to the previous track.
[…]
⛅ Weather plugin – stay ahead of the forecast
The new Weather Plugin for Stream Deck puts live weather updates and forecasts at your fingertips, with minimal setup and configuration. Instantly see the sky’s latest mood and plan your day without ever picking up your phone or opening a browser.
[…]
🛠️ Improvements and bug fixes
The Open Application action now displays a green dot when the selected app is running.
You can now configure the Open Application action to either do nothing, close, or force quit the selected app when long-pressed.
[…]
In iOS 26, Apple is adding a series of exciting new actions to Shortcuts, with a heavy focus on Apple Intelligence, including direct access to their Foundation Models with the new Use Model action.
Alongside that, Apple has actions for Writing Tools, Image Playground, and Visual Intelligence, plus the ability to Add Files to Freeform and Notes, Export in Background from the iWork apps, and new Find Conversations & Find Messages actions for the Messages app, among others.
Plus, new updates to current actions—like turning Show Result into Show Content—make existing functionality easier to understand.
Here’s everything that’s new – available now in Public Beta:
Apple Intelligence
The major focus of actions in iOS 26 is access to Apple Intelligence, both directly from the Foundation Models and indirectly through pre-built Writing Tools actions and Image Playground actions – plus a simple “Open to Visual Intelligence” action that seems perfectly suited for the Action button.
Use Model
Use Model
Private Cloud Compute
Offline
ChatGPT Extension
Writing Tools
Make Table from Text
Make List from Text
Adjust Tone of Text
Proofread Text
Make Text Concise
Rewrite Text
Summarize Text
Visual Intelligence
Open to Visual Intelligence
Image Playground
Create Image
Actions
Apple has added new actions for system apps and features, starting with an interesting Search action that pulls in a set number of results, similar to Spotlight.
Both Freeform and Notes got “Add File” actions, plus you can add directly to checklists in Notes now too. Apple put the Background Tasks to work with exporting from iWork apps, and nice-to-have actions for Sports, Photos, and Weather make it easier to take advantage of those apps.
Particularly nice are Find Conversations and Find Messages, the former of which works well with Open Conversation, and the latter of which is a powerful search tool.
Search
Search
Freeform
Add File to Freeform
Notes
Add File to Notes
Append Checklist Item
iWork
Export Spreadsheet in Background
Export Document in Background
Export Presentation in Background
Documents
Convert to USDZ
Sports
Get Upcoming Sports Events
Photos
Create Memory Movie
Messages
Find Conversations
Find Messages
Weather
Add Location to List
Remove Location from List
Updated
Apple continues to make Shortcuts actions easier to understand and adopt for new users, making small tweaks like clarifying Show Content and Repeat with Each Item.
Plus, existing actions like Calculate Expression, Translate, and Transcribe have benefitted from system-level improvements:
Show Result is now titled Show Content
Repeat with Each is now labeled “Repeat with Each Item” once placed
Open Board for Freeform now shows as App Shortcuts
Calculate Expression can accept real-time currency data
Translate has been improved
Transcribe has been improved
“Use Search as Input” added to Shortcut Input
Coming this Fall
These new actions are available now in Public Beta—install at your own risk—and will be fully available in the fall once iOS 26 releases.
There are also further improvements on the Mac, which gained Automations in Shortcuts—including unique File, Folder, and Drive automations only available on Mac—plus the ability to run actions directly in Spotlight. I’ll cover these in future stories – be sure to check the features out if you’re on the betas.
I will update this post if any more actions are discovered in future betas, or if there’s anything I’ve missed here.
“The object of this logic and word puzzle is to complete several phrases with as few moves as possible. Each emoji may be interpreted directly, through association, or in combination with other emoji. When you attempt an answer or expand a clue, it counts as a move.”
You can also do any of the following:
Try an answer: Consider the various definitions or associations for the emoji, then drag the most appropriate emoji (or group of emoji) to complete each word or phrase. For example, a “pear” emoji 🍐 could complete “DISAP_ _ _ _,” but interpreted as “fruit” it could complete “_ _ _ _ _ FUL.” Letters relating to emoji may appear nonconsecutively in an answer. For example, dragging an “earth” emoji 🌍 to “L_ _ _N _ _E ROPES” completes the phrase “learn the ropes.” Interpret grouped emoji as a whole. For example, a single 🐠 could mean “fish,” while 🐠🐠🐠 might mean “school.”
Expand a clue: Tap [Eye icon]. This counts as a move.
Reveal answers: Tap [Three Dots icon], then tap Reveal. The answers you didn’t find are shown. The puzzle doesn’t count in your Scoreboard stats and streaks.
Today, Apple News+ debuted Emoji Game, an original puzzle that challenges subscribers to use emoji to complete short phrases. Emoji Game is now available in English for Apple News+ subscribers in the U.S. and Canada.
“Emoji Game is the perfect addition to the Apple News+ suite of word and number puzzles, turning the emoji we use every day into a brainteaser that’s approachable and fun,” said Lauren Kern, editor-in-chief of Apple News.
At WWDC25, Apple expanded access to their Foundation Models to third-party developers, making intelligence features easier to implement while maintaining privacy.
With the framework, developers are able to access local, on-device models from Apple, make requests to Private Cloud Compute when needed, and readily adopt tools like the Vision framework or SpeechAnalyzer.
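In practice, the first step is usually checking that the model is actually available on the user’s device – a small sketch, based on my understanding of the framework’s availability API:

```swift
import FoundationModels

// Check whether the on-device model can be used before offering an
// intelligence feature; reasons include unsupported hardware or
// Apple Intelligence being turned off.
func intelligenceIsAvailable() -> Bool {
    switch SystemLanguageModel.default.availability {
    case .available:
        return true
    case .unavailable(let reason):
        print("Model unavailable: \(reason)")
        return false
    }
}
```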
After announcing updates at WWDC, Apple released four new developer sessions directly related to App Intents—the API that lets Apple Intelligence understand and interact with apps—following up on sessions from years past.
Here are this year’s sessions – in my recommended viewing order:
Start with the summary of the API, see what’s new this year, learn the most relevant ways users will interact with your app, and then take a look at advances in snippets – in 1 1/2 hours of focused viewing.
App Intents lets developers deeply integrate their app’s actions and content with system experiences across platforms, including Siri, Spotlight, widgets, controls, and more.
This year, App Intents gains support for visual intelligence. This enables apps to provide visual search results within the visual intelligence experience, allowing users to go directly into the app from those results. For instance, Etsy is leveraging visual intelligence to enhance the user experience in its iOS app by facilitating faster and more intuitive discovery of goods and products.
“At Etsy, our job is to seamlessly connect shoppers with creative entrepreneurs around the world who offer extraordinary items — many of which are hard to describe. The ability to meet shoppers right on their iPhone with visual intelligence is a meaningful unlock, and makes it easier than ever for buyers to quickly discover exactly what they’re looking for while directly supporting small businesses,” said Etsy CTO Rafe Colburn.
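As a concrete (if tiny) illustration of the core pattern those sessions cover – defining an action and promoting it to an App Shortcut – here’s a sketch of my own, with an illustrative intent and phrasing:

```swift
import AppIntents

// A hypothetical intent, promoted below to an App Shortcut so Siri and
// Spotlight can run it with zero user setup.
struct StartTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Timer"

    func perform() async throws -> some IntentResult {
        // The app's real timer logic would run here.
        return .result()
    }
}

struct ExampleShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartTimerIntent(),
            phrases: ["Start a timer in \(.applicationName)"],
            shortTitle: "Start Timer",
            systemImageName: "timer"
        )
    }
}
```

From there, the same intent definition becomes the hook for Spotlight, widgets, controls, and Apple Intelligence – one definition, many surfaces.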
From my stream with Rudrank Riyam on YouTube Live — tune in:
This week, developer Rudrank Riyam, author of the AiOS Dispatch newsletter, joins me to talk about his experiences developing for Apple platforms using AI-assisted coding, and what we’re interested in ahead of Apple’s Worldwide Developer Conference (WWDC).