They discussed how Gurman got started covering Apple, the stories resurfacing this week around the Siri update being powered by Google’s Gemini, and the iPhone 17 lineup.
David Pierce, host of The Vergecast, often complains that Shortcuts is too complicated and not useful. Equally as often, I tell him he’s wrong on social media, but this time I got to do it live! My thanks to David for inviting me on The Vergecast, and I’m pretty sure I won this round.
If you haven’t heard of TBPN, the “Technology Brothers Podcast Network” is an increasingly popular show by hosts Jordi Hays and John Coogan that covers the major news of the day in the technology and business world – almost like a CNN for Silicon Valley. Streaming live from 11 AM – 2 PM PST every weekday, TBPN is known for high-profile guest interviews, clippable moments shared on social media, and a somewhat-irreverent tone paired with a deep knowledge & passion for the space.
I’m a fan of the way TBPN has given a breath of fresh air to technology coverage, simultaneously innovating on top of cable TV news, video & audio podcasts, and livestream formats in a new media organization for the current era. When things like AI are changing within a single week, the show provides a spotlight for understanding what’s going on as things move so quickly – and they demonstrate a better grasp of how to spread their content than any organization I’ve seen lately. The show has even evolved into a de facto part of the technology media circuit, where having your startup’s news broken on TBPN is an indicator of success (much like getting coverage on TechCrunch).
With the show’s 3-hour runtime and multiple formats, it’s reasonable that an average listener won’t always engage with the entirety of each show, so it sure would be helpful if there were some sort of way to access everything as needed… like a shortcut, perhaps.
Me being me, I built a folder of shortcuts for TBPN for Apple’s Shortcuts app. These shortcuts let you listen to the show on Apple Podcasts and Spotify, watch the livestream or put the feed up on your TV, and follow the team on X. Plus, as you’ll soon learn, the show is heavily sponsored by Ramp, so I created a cheeky shortcut for anyone to learn more from their website.
I also wrote a blog post about the technique for opening the TBPN livestream, which involves adding /live to any YouTube channel URL.
If you’re someone who enjoys watching livestreams on YouTube, you may not know about the permanent redirect for every YouTube channel that takes you straight to their current livestream or recent streams – just add /live to the channel URL.
I recently published a set of shortcuts for TBPN, the tech & business news podcast, which I like to watch live on YouTube occasionally – the shortcut “Watch TBPN Live” uses the /live redirect to the show. In this context, having one URL for both a live show and recent streams is ideal, because I can jump straight to the full shows – oftentimes the Home tab of a YouTube channel is filled with clips or playlists, and this makes it easy to get straight to the latest full streams of any live video podcast.
My YouTube channel URL is https://www.youtube.com/@matthewcassinelli, and the URL for the Live page is technically https://www.youtube.com/@matthewcassinelli/streams. However, adding /live to the channel URL—https://www.youtube.com/@matthewcassinelli/live—creates a redirect that goes to that same streams page when I’m not live – or directly to the current livestream when I’m live on-air.
I love using this when directing people towards my own livestreams, because it provides a single, clean permalink that never changes and can be used in any social media post that’s written ahead of the stream. Once the show is over, I can use the actual video permalink to share the episode with other people, but ahead of time this single /live redirect is ideal for promotion.
Plus, since I can’t help myself, I built a shortcut for livestreams on YouTube that takes advantage of this exact capability. My “Open livestreams for this channel” shortcut lets you take any current video URL, scrape the channel URL from its metadata, and redirect you to the livestreams page of that channel – so you can see their latest streams and even tune in immediately if they’re live now. Try calling up the shortcut using Type to Siri next time you’re watching a video and see if the channel does any livestreaming.
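The redirect itself is trivial to script, too – here’s a minimal shell sketch (the channel handle below is just an example) that normalizes any channel URL and prints its /live permalink:

```shell
# Sketch: build the /live redirect for any YouTube channel URL.
# The handle used here is only an example – substitute any channel.
live_url() {
  channel="${1%/}"        # trim a trailing slash to avoid a double slash
  printf '%s/live' "$channel"
}

live_url "https://www.youtube.com/@matthewcassinelli"
# → https://www.youtube.com/@matthewcassinelli/live
```

On a Mac, you could pipe that straight into `open` to jump to the channel’s current stream.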
Use these shortcuts to watch the video podcast, tune into episodes on the go, and find the show on X – plus check out Ramp, of course:
Watch TBPN TV: Takes the RSS feed for TBPN’s YouTube channel and opens the most recent item. Includes an option to AirPlay to the Apple TV, or opens in full-screen on Mac.
Watch TBPN Live: Opens the /live URL of TBPN’s YouTube channel, which redirects to either the current livestream or the page of recent streams.
Play TBPN on Apple Podcasts: Finds and plays the latest episode of TBPN in the Apple Podcasts app. Also follows the show if you’re not already.
One of the things that I think about from time to time is Apple’s collection of apps. Some are the crown jewels, like Apple’s pro apps, and others help an everyday consumer to tackle their iLife. All are pretty starved for attention and resources, outside of infrequent updates aligned with showing off the native power of Apple Silicon, Apple Intelligence, or demos of platform integration that never quite get all the way there.
Three things really brought this up to the surface for me recently: The neglect of Clips and iMovie, the radio silence regarding Pixelmator/Photomator, and Final Cut Pro being trotted out for demos but not shipping appropriate updates.
I agree with Joe’s sentiment, but direct it more towards—you guessed it—the Shortcuts app than Pixelmator, which I’ve been saying is still within a reasonable window for an update – anything they’re working on could only feasibly ship after an entire yearly cycle.
Shortcuts, on the other hand, has been out for over 5 years and still hasn’t evolved too far beyond its original Workflow UX – Six Colors’ own Jason Snell just talked about how Shortcuts is not really that friendly on Monday of this week.
AI progress isn’t only about advancing intelligence—it’s about unlocking it through interfaces that understand context, adapt to your intent, and work seamlessly. That’s why we’re excited to share that OpenAI has acquired Software Applications Incorporated, makers of Sky.
And:
“We’ve always wanted computers to be more empowering, customizable, and intuitive. With LLMs, we can finally put the pieces together. That’s why we built Sky, an AI experience that floats over your desktop to help you think and create. We’re thrilled to join OpenAI to bring that vision to hundreds of millions of people.” —Ari Weinstein, Co-Founder and CEO, Software Applications Incorporated
Incredible run by my former teammates – first selling Workflow to Apple, and now Sky to OpenAI.
I’m super excited to see their deep talent and passion reflected in the ChatGPT app.
Accidentally wrote the perfect tweet – a technically-true point that can be wildly interpreted, which went viral on Twitter (and of course Grok got slightly wrong):
Former Apple engineer1 Matthew Cassinelli disclosed that the company conducts much of its internal operations using its own iWork apps, including Calendar, Contacts, Pages, Numbers, and Keynote, as a key aspect of its dogfooding practice. This revelation, shared on X and garnering over 15,000 engagements, highlights Apple’s commitment to testing products internally to drive improvements, with proprietary backend tools enhancing functionality for its needs. While users praised the apps’ integration and usability in areas like Keynote and Pages, criticisms focused on Contacts’ cumbersome interface and the occasional reliance on tools like Excel for complex tasks.
A few clarifications (you can find in more detail in the replies):
I both used and enjoyed these apps before I joined, but I had never seen a whole company committed to them.
Apple does not necessarily “conduct much of its internal operations” within iWork – they all have the apps and use them, but there are many other tools in place.
Employees are given these apps and use them by default, but do not exclusively use these tools; the people in Finance use Excel, for example, and anything specialized like CAD is also used – in addition to iWork.
Contacts and Calendar specifically are buoyed by an internal directory; Mail also has server-side rules that make filtering easy internally but aren’t part of the shipping product.
I worked at Apple in 2017, so this is outdated – still true at a base level, but they’ve adopted more advanced tools like Slack since then.
On a basic level, Apple provides the apps because they make the apps, and it wouldn’t make sense to pay for a second set of tools for every employee while also not using your own freely-available product.
Plus, while you’re here – if you’re ever running into speed problems with apps like Contacts or Calendar, you should look into Shortcuts. For example, the new Use Model action for Apple Intelligence makes tasks like processing contact information much easier to build within a few steps.
For what it’s worth, I was a Product Specialist and not an engineer – I studied Business Administration and Marketing before joining Workflow. I became a programmer because of Workflow (now Shortcuts), but I don’t want to misrepresent myself as a former Apple engineer. ↩
Feel free to repost your own wildly-misinterpreted version of my point so I can hit Creator Monetization and get paid. ↩
On iPhone 17 models, Apple has added new hardware and software for advanced Camera features like Dual Capture and Center Stage, which allow capturing footage in more dynamic ways than ever.
Quickly accessing new features like this and forming muscle memory is critical to user adoption & long-term habits, which is why Apple should expand the Camera app’s Shortcuts support to everything new – something I’ve requested directly via the Feedback app in issue FB20772988 (Add Dual Capture and Selfie Video to Camera actions in Shortcuts).
Dual Capture and Selfie Rotate on iPhone 17
With any iPhone 17, you’re now able to capture both front-facing and rear-facing footage at the same time in a Dual Capture experience. This is an awesome merging of hardware and software that creates a personal capturing experience I’ve loved since the Frontback days – a memory that says “here’s where I am”, but also “here’s who I am” (and “here’s who I’m with” too).
Plus, the selfie sensor has been expanded to a square size to allow both portrait and landscape capture, enabling features like a Selfie Rotate button to shoot in landscape while holding the phone vertically,1 as well as Center Stage functionality that automatically expands the shot depending on how many people are paying attention in-frame.
On The Stalman Podcast, Apple iPhone Product Manager Megan Nash specifically mentioned that holding the phone vertically created better eye gaze, which is otherwise awkward and often prevents people like me from filming themselves:
“You’ll notice people in the photos have better eye gaze because the camera preview is centered with the front camera, rather than being off to the side when you rotate iPhone to horizontal.”
These are incredible additions to the lineup and the primary reason I was excited to upgrade this year, both of which will make everyday content creation easier and also more dynamic.
Expand Camera’s App Shortcuts Support
I’m proposing that Apple add these features into the Camera app’s Shortcuts support, either in the form of expanded App Shortcuts or an overhaul to the Camera actions.
Currently, in Shortcuts, the Camera app has a single action, Open Camera, that opens the camera in a specified mode. As of writing, you’re able to choose from Selfie, Video, Portrait, Portrait Selfie, Photo, Cinematic, Slo-Mo, Time-Lapse, Pano, Spatial Video, and Spatial Photo.
Crude rendering by yours truly.
The simplest update would be to include options for Dual Capture and Landscape Selfies, allowing a quick addition to existing functionality. This would build upon the curated App Shortcuts experience, and make these new features immediately available via Siri, on the Lock Screen, in Control Center, and on the Action button – the simplest and most likely outcome.
Overhaul Camera’s App Intents Support
However, I propose Apple give the Camera app a deeper App Intents review and consider splitting up the Open Camera action in alignment with the Camera app redesign, building out the longstanding Take Video and Take Photo actions from Workflow and including additional functionality as parameters.
Take Video could include modes (and App Shortcuts) for Video, Cinematic, Slo-Mo, and Timelapse, each with dependent parameters for front-/rear-facing cameras, zoom levels and rotate options, extra features, and video formats. Take Photo could include modes (and App Shortcuts) for Photo, Selfie, Portrait, Spatial, and Pano, with the same additional functionality as parameters for each mode.2
Adding both options as separate actions would add long-desired functionality to the Camera app’s existing actions and enable a wide array of creator-focused shortcuts based on hyper-specific shooting modes. Plus, these actions could still be turned into App Shortcuts, enabling everyday users to quickly access Dual Capture or landscape-in-portrait selfies on their new iPhone 17 as needed.
Apple – please make it easier to take landscape selfies!3
If you want to see this update, please duplicate my report4 in the Feedback app to signal to Apple that multiple users want this changed.
FYI according to the Alt Text on the Apple Support website, it is officially called “the Selfie Rotate button.” ↩
There may need to be some slight fudging of “modes” to make a pleasant App Shortcuts experience here, otherwise having “normal,” “Selfie,” and “Landscape Selfie” versions of each as additional options might be too much – I can see why they might’ve chosen to avoid this route originally. That being said, they should go further with more actions rather than pulling back. ↩︎
There’s got to be a better way to say “enabling landscape selfies while holding iPhone vertically” (from 3:35) – I propose “landscape selfie” as the generic term. ↩︎
On iPhone 17, new Camera modes like Dual Capture and Selfie Rotate let users record from both cameras or film landscape selfies while holding iPhone vertically. These features aren’t available in Shortcuts or App Shortcuts, making them harder to access quickly.

The simplest improvement would be adding Dual Capture and Selfie Video options to the existing Open Camera action. Longer term, Camera could gain full App Intents support by splitting Open Camera into Take Photo and Take Video actions with parameters for mode, camera, and format.

Results Expected: I am expecting to find all Camera functionality, including Dual Capture, Selfie Video, and future modes, available in the Shortcuts app or App Shortcuts experiences for use from the Lock Screen, Control Center, or Action button. ↩
“I mean, that’s the bottom line is it’s a great idea. And like I said about Misty Studio1, all things considered, it does a pretty good job of being kind of a friendly face to building an AI model, but in the end, it’s like Shortcuts in that it’s not really that friendly.”
Fair enough – if it truly was, I’d have been out of a job for a long time.
For reference, they explained Misty Studio earlier: “Misty Studio is a demo that Apple did for the M5. Misty Studio runs an open-source model locally.” ↩
P.S. I apologize in advance to Jason for the URL slug 🙂 ↩
With the release of iOS 26, iPadOS 26, and macOS 26 this month, developers around the world are able to bring even more intelligent experiences right into their apps by tapping into the on-device large language model at the core of Apple Intelligence.
The Foundation Models framework allows developers to create new intelligence features that protect users’ privacy and are available offline, all while using AI inference that is free of cost.
You can now add intelligence to your apps for free on Apple platforms – and while it’s relatively simple today… that’s only for now.
App Intents are how Apple devices understand and interact with your app. They’re the foundation of features like Shortcuts, Siri, and Spotlight – and now Apple Intelligence. If you want your app to take advantage of the deepest parts of the Apple ecosystem, it starts with App Intents.
Following recent projects with Foodnoms, MindNode, and Tripsy, I now have availability for App Intents consulting this fall and into 2026. In addition to full start-to-finish projects, I’m introducing new flexible options:
Audits: a focused review of your existing intents, data models, and opportunities
Docs-only: structured documentation you can use with your team to implement directly
If you want Apple Intelligence to understand your app’s core features, or you want to deploy your app across the system to make a cohesive experience, I can help you design and deliver the following:
The unreleased Actions and Context portions of Apple Intelligence
App Intents, App Entities, and App Enums for your app
Automatically-generated instances of important intents as App Shortcuts
Spotlight, Siri, and Controls integrations
Custom Shortcuts to be distributed to users
Documentation on the new offerings
Ongoing updates shared in a developer newsletter
Each engagement starts with a free, 1-hour call to assess your needs, discuss budgets and rates, and outline next steps – whether you’re working on a brand, part of a team, or an indie developer, we can find a solution that works for you.
You can learn more about my services, explore past client work, and watch my conference talks on my Consulting page. If you’re ready to move forward, book a call with me directly to get started.
Let’s make your app one of the best citizens of the Apple ecosystem – ready for Apple Intelligence, Shortcuts, and beyond.
Use these to ask Apple’s on-device or Private Cloud Compute models, talk to ChatGPT, utilize Writing Tools, generate images with Image Playground, and create Memories in Photos.
Use Model: Allows you to enter a request, asks which model to prompt, then lets you ask Follow Up questions, and shows you the final response.
Pass through Writing Tools: For a given input, asks you to describe your change – then, creates a summary, key points, list, and table, plus proofreads, rewrites, and adjusts the tone. Produces Markdown-ready text, complete with auto-generated title.
Create with Image Playground: Asks you to describe an image or takes an image from input, then to choose an art style, then creates an image and shows it to you (plus saves it to Image Playground).
Create Memory in Photos: Asks you to describe a memory to create, then uses Apple Intelligence and the Photos app to generate a Memory for you.
New in iOS 26, iPadOS 26, macOS 26, watchOS 26, and visionOS 26
This update includes enhancements to the Shortcuts app across all platforms, including new intelligent actions and an improved editing experience. Shortcuts on macOS now supports personal automations that can be triggered based on events such as time of day or when you take actions like saving a file to a folder, as well as new integrations with Control Center and Spotlight.
New Actions (Editor’s note: shortened for sake of space)
Freeform
Image Playground, requires Apple Intelligence*
Mail
Measure
Messages
Screen Time
Sports
Photos
Reminders
Stocks
Use Model, requires Apple Intelligence*
Visual Intelligence, requires Apple Intelligence*
Voice Memos
Weather
Writing Tools, requires Apple Intelligence*
Updated Actions
For those building custom shortcuts, some actions have been updated:
“Calculate Expression” can now evaluate expressions that include units, including real time currency conversion rates, temperature, distance, and more
“Create QR Code” can now specify colors and styling
“Date” can now specify a holiday
“Find Contacts” can now filter by relationship
“Transcribe Audio” performance has been improved
“Show Content” can now display scrollable lists of items, like calendar events, reminders, and more
Shortcut Editor
For those building custom shortcuts, updates have been made to the shortcut editor:
Improved drag and drop and variable selection
Over 100 new icon glyphs are now available, including new shapes, transportation symbols, and more
Rich previews of calendar events, reminders, and more
The ability to choose whether shortcuts appear in Spotlight Search
macOS Improvements
Spotlight
Shortcuts can now accept input, like selected text from an open document, when being run from Spotlight.
Automations
Shortcuts can now be run automatically based on the following triggers:
Time of Day (“At 8:00 AM, weekdays”)
Alarm (“When my alarm is stopped”)
Email (“When I get an email from Jane”)
Message (“When I get a message from Mom”)
Folder (“When files are added to my Documents folder”)
File (“When my file is modified”)
External Drive (“When my external drive connects”)
Search and Take Action with Updates to Visual Intelligence
Visual intelligence, which builds on Apple Intelligence, now helps users learn and do more with the content on their iPhone screen. It makes it faster than ever for users to search, take action, and answer questions about the content they’re viewing across their apps.
Users can search for the content on their iPhone screen to find similar images across Google, as well as apps that integrate this experience, such as eBay, Poshmark, Etsy, and more. If there’s an object a user is interested in learning about, like a pair of shoes, they can simply press the same buttons used to take a screenshot and highlight it to search for that specific item or similar objects online. And with ChatGPT, users can ask questions about anything they’re viewing onscreen.
Video: Visual Intelligence on iPhone 17 Pro
Updates to visual intelligence help users learn and do more with the content on their iPhone screen.
Visual intelligence enables users to summarize and translate text, as well as add an event from a flyer on their iPhone screen to their calendar, with a single tap.
Users can also take advantage of these capabilities by using visual intelligence with their iPhone camera through Camera Control, the Action button, and in Control Center.
And:
Build Intelligent Shortcuts
Shortcuts help users accomplish more faster, by combining multiple steps from their favorite apps into powerful, personal automations. And now with Apple Intelligence, users can take advantage of intelligent actions in the Shortcuts app to create automations, like summarizing text with Writing Tools or creating images with Image Playground.
Users can tap into Apple Intelligence models, either on device or with Private Cloud Compute to generate responses that feed into the rest of their shortcut, maintaining the privacy of information used in the shortcut. For example, users can create powerful Shortcuts like comparing an audio transcription to typed notes, summarizing documents by their contents, extracting information from a PDF and adding key details to a spreadsheet, and more.
In my latest addition to the Shortcuts Library, I updated my shortcuts for the TV app with expanded functionality, including many new functions, redesigned menus, and, critically, support for macOS.
I accomplished Mac support using Shell Scripting, a technique which I’m sharing for members:
Opening deep links into Mac apps
In order to open a URL into an existing application, the best method within Shortcuts is using the Run Shell Script action, which allows you to execute command-line utilities using shell scripts, similar to the Terminal.
In order to open any app using a shell script, the following command can be used – here’s an example for the TV app:
open -a TV
In order to open a URL within that app, one only needs to add the URL afterwards, like so:
open -a TV https://tv.apple.com/watch-now
In my shortcuts, I’ve used the Deep Link variable to pass in the appropriate URL for the TV app into the shell script on-demand – only if run from Mac.
This works great with my TV app shortcuts, which utilize preset versions of the URLs from the tv.apple.com website (as well as web scraping from the Apple TV Marketing Toolbox).
On desktop, these links will activate and open the TV app for Mac – on mobile, they’ll go straight into the TV app for iPhone and iPad.
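Putting those pieces together, the Run Shell Script body amounts to a one-line command assembled from the app name and the deep link. Here’s a sketch – `build_open_cmd` is a hypothetical helper, and the Watch Now URL stands in for the Deep Link variable my shortcuts pass in at runtime:

```shell
# Sketch: assemble the command a Run Shell Script action runs on macOS.
# build_open_cmd is a hypothetical helper; the URL stands in for the
# Deep Link variable the shortcut supplies at runtime.
build_open_cmd() {
  printf 'open -a %s %s' "$1" "$2"   # $1 = app name, $2 = URL for that app
}

build_open_cmd "TV" "https://tv.apple.com/watch-now"
# → open -a TV https://tv.apple.com/watch-now
```

The same pattern works for any Mac app that registers URL handling – swap the app name and pass whatever deep link you need.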
Add to Dock and Open URLs
I’ve also used this method to open links from web services like Reddit into desktop versions of those apps – web apps I’ve created for a handful of social media sites using the “Add to Dock” feature added to macOS in recent years.
I can also see this technique being useful for native versions of web apps like Notion or Airtable – I’ll have to explore more there soon.
These are redesigned for my new approach to building shortcuts, which moves away from many separate single-purpose actions toward a more bundled approach – each shortcut provides more functionality in a targeted area.
My favorite is the new Watchlist shortcut – I’ve been working on a version of this for a year or so! Enjoy:
Open into the TV app: Presents a menu of sections in the TV app and opens the deep link into the app on iPhone, iPad, and Mac – options include Home, Search, Store, Sports, Apple TV+, and Library. When run from Apple Watch, opens the Apple TV app.
Add to my TV watchlist: Accepts a list of TV shows or movies, scrapes the results from Apple’s Marketing Toolbox, and lets you pick where to send the media – with options to open into the TV app, add to your Watchlist, send to Reminders, or copy the links.
Open sports in the TV app: Presents a menu of Sports sections available in the TV app, including overall Sports, plus MLS Season Pass and Major League Baseball, as well as a dedicated section for your favorite home team.
Browse the TV Store: Presents a menu for opening into the TV app to the Store section, either directly using a deep link, using the iTunes actions in Shortcuts, or Apple’s RSS feeds for top movie and TV content – plus categories for dedicated “rooms” in the TV app for special content.
Open from Apple TV Plus: Presents menu options for opening into the Home, Shows, Movies, and Upcoming sections of Apple TV+ in the TV app, plus categories for genres.
My friends at Elgato have updated the Stream Deck app for Mac to version 7.0 with new features for creating virtual Stream Decks on your computer, Key Logic so each key has multi-tap abilities, weather updates in a new plugin, and quality-of-life features like showing whether an app is currently open.
Here’s how they describe the updates:
🎛️ Virtual Stream Deck — your on-screen workspace controller
Create unlimited virtual keys, customize actions and layouts, then pin them in place or summon to your cursor. It’s your OS sidekick, making every workflow fast and effortless. It’s Stream Deck on your computer, anywhere you go.
[…]
👇 Key Logic — multi-tap abilities
Assign up to three different actions to a single key using Key Logic. Perform a unique action based on how the key is pressed:
Press
Double press
Press and hold
For example, press to play/pause music, double press to skip tracks, or press and hold to go to the previous track.
[…]
⛅ Weather plugin – stay ahead of the forecast
The new Weather Plugin for Stream Deck puts live weather updates and forecasts at your fingertips, with minimal setup and configuration. Instantly see the sky’s latest mood and plan your day without ever picking up your phone or opening a browser.
[…]
🛠️ Improvements and bug fixes
The Open Application action now displays a green dot when the selected app is running.
You can now configure the Open Application action to either do nothing, close, or force quit the selected app when long-pressed.

[…]