My @interaction Poke shortcut for Apple Watch Ultra’s action button.
Dictate Text + Send Message – nice & simple, and a neat use of the Action Button (on either iPhone or Apple Watch).
For anyone who doesn’t know, Poke is “your proactive AI assistant that turns your emails into action” and is known on X for its cheeky-but-well-done AI personality as well as its smooth integrations.
There are new Find Conversation and Find Messages actions for Shortcuts that I haven’t seen anyone play with yet – those could be an interesting addition to this workflow.
Also the Messages automation could be fun – you could flash the lights when you get a Poke message, for example.1
I’ve just added a set of new folders to the Shortcuts Library, all for Notion — a set for the main Notion app, menus of special Pages, and then Notion Mail & Notion Calendar:
Notion
My main set of Notion shortcuts is designed for the main app experiences – New Page would be great for the Action button:
Open my Notion home page: Opens the URL to the Home page in your Notion workspace, which shows recently visited pages, upcoming events, database views, and featured templates.
Open Meetings in Notion: Opens the URL to the top-level Meetings page in Notion where you can see upcoming meetings, start transcribing, and browse past AI meeting notes.
Open Notion AI: Opens the URL for Notion AI so you can “Ask, search, or make anything…” – plus add context from a Page within Notion. You can also attach files, choose a model, use Research mode, or start a Web search – as well as get started with a few shortcuts.
Open Notion Mail: Opens Notion Mail by checking if the app is present; if not, opens the URL.
Open Notion Calendar: Opens Notion Calendar by checking if the app is present; if not, opens the URL.
Open the Marketplace in Notion: Opens the URL for the template Marketplace, where you can find and purchase predesigned setups for Notion.
Get Help with Notion: Opens the URL to the Help and Documentation resources from Notion where you can search for anything and learn from Notion Academy.
For my set of Notion Pages shortcuts, you can copy different IDs and Views from various pages, then use the others to open into those pages – whether within one team, a whole teamspace, or your entire workspace:
Open page in Side Peek: Proof of concept of the URL pattern in Notion for taking a Page URL and opening it in Side Peek as a Subpage of a particular View.
Copy View from Notion link: Gets a URL from input, matches the URL structure for a page ID and the associated View ID, and copies the View identifier to the clipboard. Can also be set to extract both IDs in an array.
Copy ID from Notion link: Gets a URL from input, matches a 32-character string from the text (and gets the first item in case there’s also a View ID), and copies that identifier to the clipboard – see the sketch after this list.
Open from a team Pages in Notion: Presents a preset list of Pages from a particular workspace, then takes the corresponding ID and opens the URL in Notion.
Open a Teamspace in Notion: Use my shortcut “Copy ID from Notion link” to easily extract the ID for any Teamspace, then add it to the Text action below.
Open from my Teamspaces in Notion: Presents a menu of your Teamspaces to choose from, then takes the corresponding ID and opens the URL in Notion.
Open from my Workspace pages in Notion: Presents a menu of Teamspace titles to choose from, then a preset list of Pages from that workspace, then takes the corresponding ID and opens the URL in Notion.
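If you’re curious what that ID matching looks like outside of Shortcuts, here’s a minimal Swift sketch of the same idea. It assumes – as the shortcuts above do – that a Notion page ID is a 32-character string and that a View ID arrives in the v= query parameter; the function names and example URL are mine, purely for illustration:

```swift
import Foundation

// Sketch of "Copy ID from Notion link": match the first
// 32-character hex string in the URL (the page ID comes before
// any View ID, so the first match is the one we want).
func notionPageID(from link: String) -> String? {
    guard let range = link.range(of: "[0-9a-f]{32}",
                                  options: [.regularExpression, .caseInsensitive]) else {
        return nil
    }
    return String(link[range])
}

// Sketch of "Copy View from Notion link": read the View ID
// out of the v= query parameter.
func notionViewID(from link: String) -> String? {
    URLComponents(string: link)?
        .queryItems?
        .first(where: { $0.name == "v" })?
        .value
}

// Hypothetical example URL, for illustration only.
let link = "https://www.notion.so/Page-0123456789abcdef0123456789abcdef?v=fedcba9876543210fedcba9876543210"
print(notionPageID(from: link) ?? "no page ID")
print(notionViewID(from: link) ?? "no View ID")
```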
If you’re a Notion Mail user, you can use my set of Notion Mail shortcuts to open into the various pages on the Mac – I’m still trying to figure out the iOS URL scheme and unfortunately it’s not available on iPad:
Open Settings in Notion Mail: Opens the URL to the default Settings page for Notion Mail, which shows Inbox settings. Other options include Notion AI, Gmail filters, Snippets, Signature, and Account, plus links to Members and Plans for the Workspace.
Open Snippets in Notion Mail: Opens the URL to the Snippets section of Settings in Notion Mail, where you can create new snippets, edit existing ones, and change their icon or shortcut.
Open the Trash in Notion Mail: Opens the Trash can in Notion Mail where you can see recently-deleted emails and recover any before they expire.
Open Spam in Notion Mail: Opens the URL to the Spam section of Notion Mail so you can see if anything important slipped through and delete the rest.
Open Drafts in Notion Mail: Opens the URL to your unsent Drafts in Notion Mail where you can continue where you left off.
Open Sent in Notion Mail: Opens the URL to your Sent messages in Notion Mail, where you can see past emails of yours grouped by date.
Open All Mail in Notion Mail: Opens the link to the All Mail section of Notion Mail which shows unread, read, and archived emails.
Open Search in Notion Mail: Opens the link to the Search field in Notion Mail so you can start typing your query in the search box.
If you’re a fan of Notion Calendar, you can use my set of Notion Calendar shortcuts to quickly jump into various sections of the website:
Open to a Month in Notion: Asks you to pick a date, then extracts just the year and month values, then opens the URL to the Month view in Notion Calendar. Defaults to next month.
Open to a Week in Notion: Asks you to pick a date, then gets the start of the week and opens the URL to the Week view in Notion Calendar. Defaults to the start of next week.
Open to a Day in Notion: Asks you to pick a date, then extracts the month, year, & day values, then opens the URL to the Day view in Notion Calendar. Defaults to tomorrow (see the sketch after this list).
Open to any Date in Notion: Asks you to pick a date, then a view mode, then opens the URL for that view. For Week view, shows from the start of the week.
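For a sense of the date handling in these, here’s a minimal Swift sketch of the general approach: pick a date, pull out the components, and compose a URL. Note the URL path here is purely a placeholder of mine – the real Notion Calendar pattern lives inside the shortcuts:

```swift
import Foundation

// Sketch of "Open to a Day in Notion": default to tomorrow,
// extract the year/month/day values, and build the view URL.
let calendar = Calendar.current
let tomorrow = calendar.date(byAdding: .day, value: 1, to: Date())!
let parts = calendar.dateComponents([.year, .month, .day], from: tomorrow)

if let year = parts.year, let month = parts.month, let day = parts.day {
    // Placeholder path – substitute the actual Notion Calendar URL pattern.
    let url = "https://calendar.notion.so/<your-view>/\(year)/\(month)/\(day)"
    print(url)
}
```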
I’ve just updated a folder in the Shortcuts Library — my set of Remote Login shortcuts, which use the Run Script over SSH action to perform remote login actions on your Mac devices.
Use these to set up a dictionary of the IP addresses for your Mac devices, then run the shortcuts or use Siri to wake or sleep the devices. Includes individual options for my Mac mini and MacBook Air:
Wake my Mac: Sends a remote command via the IP address of your Mac to wake up, then simulates keystrokes to keep it awake. Asks to pick between your Mac devices using my shortcut “Get the IP Addresses for my Macs.”
Sleep my Mac: Sends a remote command via the IP address of your Mac to go to sleep immediately. Asks to pick between your Mac devices using my shortcut “Get the IP Addresses for my Macs.”
Wake my MacBook Air: Sends a remote command via the IP address of your MacBook Air to wake up, then simulates keystrokes to keep it awake.
Sleep my MacBook Air: Sends a remote command via the IP address of your MacBook Air to go to sleep immediately.
Wake my Mac mini: Sends a remote command via the IP address of your Mac mini to wake up, then simulates keystrokes to keep it awake.
Sleep my Mac mini: Runs a shell script or sends a remote command via the IP address of your Mac mini to go to sleep immediately. Works well with Stream Deck (see the sketch after this list).
Run Script over SSH demo: Example shortcut that demonstrates the capabilities of Run Script over SSH to control your Mac remotely (and Run AppleScript when not triggered remotely).
Get the IP Addresses for my Macs: Outputs a predefined dictionary of the title and IP address for each of your Mac devices. On import, asks you to enter the IP addresses.
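For anyone wondering what Run Script over SSH is actually doing, here’s a rough Swift sketch that shells out to ssh on a Mac. The host, user, and commands are placeholders of mine – the shortcuts’ exact scripts aren’t reproduced here – but pmset sleepnow and caffeinate -u are the standard macOS commands for sleeping a Mac and asserting user activity, and the target Mac needs Remote Login enabled:

```swift
import Foundation

// Rough sketch of the Run Script over SSH idea: run a command on a
// remote Mac. Assumes key-based SSH auth and Remote Login enabled
// on the target; the host and user below are placeholders.
func runOverSSH(host: String, user: String, command: String) throws {
    let ssh = Process()
    ssh.executableURL = URL(fileURLWithPath: "/usr/bin/ssh")
    ssh.arguments = ["\(user)@\(host)", command]
    try ssh.run()
    ssh.waitUntilExit()
}

// Put the Mac to sleep immediately (what the Sleep shortcuts describe).
try? runOverSSH(host: "10.0.1.20", user: "me", command: "pmset sleepnow")

// Assert user activity for a few seconds – one plausible stand-in for
// the "simulates keystrokes to keep it awake" step in the Wake shortcuts.
try? runOverSSH(host: "10.0.1.20", user: "me", command: "caffeinate -u -t 5")
```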
Modeled after the iPod Sock, the AirPods Beanies are a set of 4 cloth sleeves for your AirPods, each in a different color – my partner and I have been sharing them for the last four years. Designed to slide over the AirPods case, the sleeve protects it in your pocket from scratches or dings, plus gives it a bit of grip so it doesn’t slide out of your pocket when you sit back in a chair.
On the actual iPhone Pocket itself – I am not immediately in love with the price, the aesthetics, or the reality of what it’s like to use this product. But that’s not really the point – it’s a collaboration with a designer, and an exploration of a product.
The Crossbody Strap is designed to attach to select Apple cases for a convenient and hands-free way to wear iPhone.
Beautifully crafted from 100 percent recycled PET yarns, the smooth, narrow woven straps drape comfortably across the body.
Embedded flexible magnets with stainless steel sliding mechanisms allow you to effortlessly adjust the length for the perfect fit, while keeping both straps securely and neatly aligned.
Perhaps we’ll see something similar from Apple themselves in response to the iPhone Pocket? AirPods Socks?
“iPhone Pocket is a collaboration between Apple and ISSEY MIYAKE. Based upon a mutual respect and shared approach to design, it’s inspired by the concept of “a piece of cloth” and features a singular 3D-knitted construction designed to fully enclose iPhone, while expanding to fit your everyday items.
Featuring a ribbed mesh structure with the qualities of the original pleats patented by ISSEY MIYAKE, iPhone Pocket is a beautiful way to wear and carry iPhone. When stretched, the open textile subtly reveals its contents and allows you to peek at your iPhone display. Born out of the idea of creating an additional pocket, while also being playful and versatile, iPhone Pocket is available in a short strap length (in eight colors), and a long strap length (in three colors), suitable for a variety of wearing styles – handholding, tying onto bags, or wearing directly on your body.” ↩
Nooo, the App Store logo is cleverly made from 3, tilted app icons. I made a little 3D printed model to visualize it, makes for a cool desk ornament https://t.co/OLXPBE33gZ pic.twitter.com/agcncxvEdZ
Edit: More updates from Edward Sanchez, former designer at Apple – the original version is indeed a 3D rendering:
I hereby raise a challenge to any 3D artist to correctly recreate the app store icon based on the Instruments icon mesh – and show a 360 rotation of it. It’s surprising, and as you can see, not sticks. I can’t share the source, but I can reward the winner with a like and retweet. pic.twitter.com/fejVd5yTTt
If you’ve seen any linked posts on my blog, you’ve probably seen Stephen Robles – a prolific content creator who’s made a massive impact in the Shortcuts community the last few years. Just last week, Stephen announced that, thanks to the support of his community, he’s gone solo and left his job to pursue YouTube & his Shortcuts membership full-time.
My congratulations to Stephen – he’s an amazing person, a hard worker, and has given me so much value with his Shortcuts work.
Become a member of the community and gain access to my ShortcutsGPT, Shortcut of the Week, searchable Shortcuts database, priority Shortcuts requests, and more!
It’s the idea that there’s a personal data trove that you have. And then you’re asking the model to query the contents. […] You know about all this stuff, now do stuff with it.
And if you can do that on device, it’s really powerful. [I]t’s hard to do that in the cloud because you would actually need to upload it to Private Cloud Compute. And that’s a lot of data. So what you want is some parsing to happen on device.
But that’s the dream, right? Is that your phone knows about your stuff and then the LLM that’s running can make distinctions based on your stuff.
And ideally the model could potentially, and I know this is wild, I don’t think they’ve talked about it, but ideally the model could query your personal context, get a bunch of related data out, and then send that in a query to the private cloud and have it process it, right? You could have a kind of cherry picking your personal data model on device that then kicks it to the more powerful model to process it and intuit things about it.
There’s lots of ways you could do this. It’s a great idea. It was a great idea in 2024 when they showed it, but they’ve got to do it – that’s the challenge there.
In reply to Jason, cohost Myke Hurley said the following:
So, I’m just going to make a little prediction. These[…] things that I’ve spoken about, we will see iOS 27 before [they] ship.
I believe they will have stuff – like, I believe they will have [releases] in the spring like it has been rumored, but I don’t think all of these things.
I think particularly the Personal Context thing… we may never see that.
For what it’s worth, Apple has done this and named it Queries. Shortcuts users might better understand this as the Find actions, which let you find and filter data from apps before using it in your shortcuts.
Introduced for developers alongside the App Intents API in 2022, Queries are how intents/actions retrieve entities/data from apps. In their most recent session, “Get to know App Intents” from 2025, they explicitly say the following – a phrase that caught my attention regarding the “new Siri” we’ve been waiting for:
Queries are the way the system can reason about my entities
Apple has also been building out their ability to index and query these entities through their Spotlight support, as well as now Visual Intelligence.
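For developers, it helps to see how small a Query actually is: it’s just a type that hands entities back to the system on request. Here’s a minimal sketch – the Book example and its properties are mine, but AppEntity and EntityQuery are the real protocols from the App Intents framework:

```swift
import AppIntents

// A minimal entity the system can reason about.
// (The Book example is hypothetical; the protocols are the real API.)
struct BookEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Book"
    static var defaultQuery = BookQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// The Query: how the system – Siri, Spotlight, and Shortcuts'
// Find actions – retrieves entities from the app.
struct BookQuery: EntityQuery {
    func entities(for identifiers: [BookEntity.ID]) async throws -> [BookEntity] {
        // Look up the requested items in the app's own store (stubbed here).
        []
    }

    func suggestedEntities() async throws -> [BookEntity] {
        // Entities the system can offer proactively, e.g. in Shortcuts' pickers.
        []
    }
}
```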
The 3D Marvel films on the AVP look better than anyone has ever seen them before. The capabilities of the VisionPro are really unique and we remastered all the films for this format.
And:
Our goal was to match the color and brightness of the 2D HDR versions but for 3D. The Vision Pro delivers basically perfect stereo contrast so no ghosting, HDR color, UHD resolution and we did some work on the older titles as well.
With 3D movies, Disney’s storytelling will also leap off the screen like never before with remarkable depth and clarity for an unprecedented in-home 3D experience on Disney+ with Apple Vision Pro.
Marvel Studios’ The Fantastic Four: First Steps is now streaming on Disney+ and watching it in 4K 3D with Spatial Audio on Apple Vision Pro is just awesome 🤩
Feels like sitting in a private cinema with a giant screen and insane clarity. Add AirPods and you’re fully immersed 🍿 pic.twitter.com/s4k7DKbpyb
From the Apple Developer documentation (break added):
With visual intelligence, people can visually search for information and content that matches their surroundings, or an onscreen object.
Integrating your app with visual intelligence allows people to view your matching content quickly and launch your app for more detailed information or additional search results, giving it additional visibility.
And:
To integrate your app with visual intelligence, the Visual Intelligence framework provides information about objects it detects in the visual intelligence camera or a screenshot. To exchange information with your app, the system uses the App Intents framework and its concepts of app intents and app entities.
When a person performs visual search on the visual intelligence camera or a screenshot, the system forwards the information captured to an App Intents query you implement. In your query code, search your app’s content for matching items, and return them to visual intelligence as app entities. Visual intelligence then uses the app entities to display your content in the search results view, right where a person needs it.
To learn more about a displayed item, someone can tap it to open the item in your app and view information and functionality. For example, an app that allows people to view information about landmarks might show detailed information like hours, a map, or community reviews for the item a person taps in visual search.
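Pieced together from Apple’s description above, here’s roughly what that query looks like in code – the landmark entity and the stubbed matching logic are stand-ins of mine, while IntentValueQuery and SemanticContentDescriptor come from the frameworks Apple names:

```swift
import AppIntents
import VisualIntelligence

// A stand-in app entity; a real app would flesh this out
// (see the AppEntity/EntityQuery pattern discussed earlier).
struct LandmarkEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Landmark"
    static var defaultQuery = LandmarkQuery()
    var id: String
    var name: String
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct LandmarkQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [LandmarkEntity] { [] }
}

// The visual search hookup: the system forwards what the camera or a
// screenshot captured, and the query returns matching app entities
// for visual intelligence to display.
struct LandmarkVisualQuery: IntentValueQuery {
    func values(for input: SemanticContentDescriptor) async throws -> [LandmarkEntity] {
        guard let pixelBuffer = input.pixelBuffer else { return [] }
        _ = pixelBuffer // search the app's content for matches here (stubbed)
        return []
    }
}
```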
Use visual intelligence to quickly learn more about what’s in front of you, whether in your physical surroundings or on your iPhone screen.
To learn more about your physical surroundings using your iPhone camera on models that have the Camera Control, just click and hold it to do things like look up details about a restaurant or business; have text translated, summarized, or read aloud; identify plants and animals; search visually for objects around you; ask questions; and more. […You can also] access visual intelligence by customizing the Action button or Lock Screen, or opening Control Center. See Alternate options to using the Camera Control.
To learn more about the content on your iPhone screen across your apps, simply press the same buttons you use to take a screenshot. You can search visually, ask questions, and take action, like turning a flyer or invite into a calendar event.
They discussed how Gurman got started covering Apple, the stories resurfacing this week around the Siri update being powered by Google’s Gemini, and the iPhone 17 lineup.
David Pierce, host of The Vergecast, often complains that Shortcuts is too complicated and not useful. Equally as often, I tell him he’s wrong on social media, but this time I got to do it live! My thanks to David for inviting me on The Vergecast, and I’m pretty sure I won this round.
If you haven’t heard of TBPN, the “Technology Brothers Podcast Network” is an increasingly popular show by hosts Jordi Hays and John Coogan that covers the major news of the day in the technology and business world – almost like a CNN for Silicon Valley. Streaming live from 11 AM – 2 PM PST every weekday, TBPN is known for high-profile guest interviews, clippable moments shared on social media, and a somewhat-irreverent tone paired with a deep knowledge & passion for the space.
I’m a fan of the way TBPN has given a breath of fresh air to technology coverage, simultaneously innovating on top of cable TV news, video & audio podcasts, and livestream formats in a new media organization for the current era. When things like AI are changing within a single week, the show provides a spotlight for understanding what’s going on as things move so quickly – and they demonstrate a better grasp of how to spread their content than any organization I’ve seen lately. The show has even evolved into a de facto part of the technology media circuit, where having your startup’s news broken on TBPN is an indicator of success (much like getting coverage on TechCrunch).
With the show’s 3-hour runtime and multiple formats, it’s reasonable that an average listener won’t always engage with the entirety of each show, so it sure would be helpful if there was some sort of way to access everything as needed… like a shortcut perhaps.
Me being me, I built a folder of shortcuts for TBPN for Apple’s Shortcuts app. These shortcuts let you listen to the show on Apple Podcasts and Spotify, watch the livestream or put the feed up on your TV, plus follow the team on X. Plus, as you’ll soon learn, the show is heavily sponsored by Ramp, so I created a cheeky shortcut for anyone to learn more from their website.
I also wrote a blog post about the technique for opening the TBPN livestream, which involves adding /live to any YouTube channel URL.
If you’re someone who enjoys watching livestreams on YouTube, you may not know about the permanent redirect for every YouTube channel that takes you straight to their current livestream or recent streams – just add /live to the channel URL.
I recently published a set of shortcuts for TBPN, the tech & business news podcast, which I like to watch live on YouTube occasionally – the shortcut “Watch TBPN Live” uses the /live redirect to the show. In this context, having one URL for both the live show and recent streams is ideal, because I can jump straight to the full shows – oftentimes the Home tabs of YouTube channels are filled with clips or playlists, and this makes it easy to get straight to the latest full streams of any live video podcast.
My YouTube channel URL is https://www.youtube.com/@matthewcassinelli, and the URL for the Live page is technically https://www.youtube.com/@matthewcassinelli/streams. However, adding /live to the channel URL—https://www.youtube.com/@matthewcassinelli/live—creates a redirect that goes to that same streams page when I’m not live – or directly to the current livestream when I’m live on-air.
I love using this when directing people towards my own livestreams, because it provides a single, clean permalink that never changes and can be used in any social media post that’s written ahead of the stream. Once the show is over, I can use the actual video permalink to share the episode with other people, but ahead of time this single /live redirect is ideal for promotion.
Plus, since I can’t help myself, I built a shortcut for livestreams on YouTube that takes advantage of this exact capability. My “Open livestreams for this channel” shortcut lets you take any current video URL, scrape the channel URL from its metadata, and redirect you to the livestreams page of that channel – so you can see their latest streams and even tune in immediately if they are live now. Try calling up the shortcut using Type to Siri next time you’re watching a video and check out if the channel does any livestreaming.
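If you want to see the scraping step outside of Shortcuts, here’s a hedged Swift sketch of one way to do it: YouTube watch pages embed the channel’s “UC…” ID in their source, so you can fetch the page, match the first channel-ID-shaped string, and append /live. The regex is my observation about current page markup, not a documented API, so treat it accordingly:

```swift
import Foundation

// Sketch: derive a channel's /live URL from any of its video URLs.
// Assumes the watch page HTML embeds a "UC..." channel ID – an
// observation about current markup, not a documented API.
func liveURL(forVideoPage videoURL: URL) async throws -> URL? {
    let (data, _) = try await URLSession.shared.data(from: videoURL)
    guard let html = String(data: data, encoding: .utf8) else { return nil }

    // YouTube channel IDs are "UC" plus 22 URL-safe characters.
    guard let range = html.range(of: "UC[0-9A-Za-z_-]{22}",
                                 options: .regularExpression) else {
        return nil
    }
    let channelID = String(html[range])
    return URL(string: "https://www.youtube.com/channel/\(channelID)/live")
}
```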
Use these shortcuts to watch the video podcast, tune into episodes on the go, and find the show on X – plus check out Ramp, of course:
Watch TBPN TV: Takes the RSS feed for TBPN’s YouTube channel and opens the most recent item. Includes an option to AirPlay to the Apple TV, or opens in full-screen on Mac.
Watch TBPN Live: Opens the /live URL of TBPN’s YouTube channel, which redirects to either the current livestream or the page of recent streams.
Play TBPN on Apple Podcasts: Finds and plays the latest episode of TBPN in the Apple Podcasts app. Also follows the show if you’re not already.
One of the things that I think about from time to time is Apple’s collection of apps. Some are the crown jewels, like Apple’s pro apps, and others help an everyday consumer to tackle their iLife. All are pretty starved for attention and resources, outside of infrequent updates aligned with showing off the native power of Apple Silicon, Apple Intelligence, or demos of platform integration that never quite get all the way there.
Three things really brought this up to the surface for me recently: The neglect of Clips and iMovie, the radio silence regarding Pixelmator/Photomator, and Final Cut Pro being trotted out for demos but not shipping appropriate updates.
I agree with Joe’s sentiment, but direct it more towards—you guessed it—the Shortcuts app than Pixelmator, which I’ve been saying is within a reasonable window for updating – anything they’re working on could only feasibly ship after an entire yearly cycle.
Shortcuts, on the other hand, has been out for over 5 years and still hasn’t evolved far beyond its original Workflow UX – Six Colors’ own Jason Snell talked just this Monday about how Shortcuts is not really that friendly.
AI progress isn’t only about advancing intelligence—it’s about unlocking it through interfaces that understand context, adapt to your intent, and work seamlessly. That’s why we’re excited to share that OpenAI has acquired Software Applications Incorporated, makers of Sky.
And:
“We’ve always wanted computers to be more empowering, customizable, and intuitive. With LLMs, we can finally put the pieces together. That’s why we built Sky, an AI experience that floats over your desktop to help you think and create. We’re thrilled to join OpenAI to bring that vision to hundreds of millions of people.” —Ari Weinstein, Co-Founder and CEO, Software Applications Incorporated
Incredible run by my former teammates – first selling Workflow to Apple, and now Sky to OpenAI.
I’m super excited to see their deep talent and passion reflected in the ChatGPT app.
Accidentally wrote the perfect tweet – a technically-true point that can be wildly interpreted, which went viral on Twitter (and which, of course, Grok got slightly wrong):
Former Apple engineer1 Matthew Cassinelli disclosed that the company conducts much of its internal operations using its own iWork apps, including Calendar, Contacts, Pages, Numbers, and Keynote, as a key aspect of its dogfooding practice. This revelation, shared on X and garnering over 15,000 engagements, highlights Apple’s commitment to testing products internally to drive improvements, with proprietary backend tools enhancing functionality for its needs. While users praised the apps’ integration and usability in areas like Keynote and Pages, criticisms focused on Contacts’ cumbersome interface and the occasional reliance on tools like Excel for complex tasks.
A few clarifications (you can find more detail in the replies):
I both used and enjoyed these apps before I joined, but I had never seen a whole company committed to them.
Apple does not necessarily “conduct much of its internal operations” within iWork – they all have the apps and use them, but there are many other tools in place.
Employees are given these apps and use them by default, but do not exclusively use these tools; the people in Finance use Excel, for example, and anything specialized like CAD is also used – in addition to iWork.
Contacts and Calendar specifically are buoyed by an internal directory; Mail also has server-side rules that make filtering easy internally – functionality that isn’t in the shipping product.
I worked at Apple in 2017, so this is outdated – still true at a base level, but they’ve adopted more advanced tools like Slack since then.
On a basic level, Apple provides the apps because they make the apps, and it wouldn’t make sense to pay for a second set of tools for every employee while also not using your own freely-available product.
Plus, while you’re here – if you’re ever running into speed problems with apps like Contacts or Calendar, you should look into Shortcuts. For example, the new Use Model action for Apple Intelligence makes tasks like processing contact information much easier to build within a few steps.
For what it’s worth, I was a Product Specialist and not an engineer – I studied Business Administration and Marketing before joining Workflow. I became a programmer because of Workflow (now Shortcuts), but I don’t want to misrepresent myself as a former Apple engineer. ↩
Feel free to repost your own wildly-misinterpreted version of my point so I can hit Creator Monetization and get paid. ↩
On iPhone 17 models, Apple has added new hardware and software updates for advanced Camera features like Dual Capture and Center Stage, which allow capturing footage in more dynamic ways than ever.
Quickly accessing new features like this and forming muscle memory is critical to user adoption & long-term habits, which is why Apple should expand the Camera app’s Shortcuts support to everything new – something I’ve requested directly via the Feedback app in issue FB20772988 (Add Dual Capture and Selfie Video to Camera actions in Shortcuts).
Dual Capture and Selfie Rotate on iPhone 17
With any iPhone 17, you’re now able to capture both front-facing and rear-facing footage at the same time in a Dual Capture experience. This is an awesome merging of hardware and software that creates a personal capturing experience I’ve loved since the Frontback days – a memory that says “here’s where I am”, but also “here’s who I am” (and “here’s who I’m with” too).
Plus, the selfie sensor has been expanded to a square size to allow both portrait and landscape capture, enabling features like a Selfie Rotate button to shoot in landscape while holding the phone vertically,1 as well as Center Stage functionality that automatically expands the shot depending on how many people are in the frame.
On The Stalman Podcast, Apple iPhone Product Manager Megan Nash specifically mentioned that holding the phone vertically created better eye gaze, which is otherwise awkward and often prevents people like me from filming themselves:
“You’ll notice people in the photos have better eye gaze because the camera preview is centered with the front camera, rather than being off to the side when you rotate iPhone to horizontal.”
These are incredible additions to the lineup and the primary reason I was excited to upgrade this year; both will make everyday content creation easier and more dynamic.
Expand Camera’s App Shortcuts Support
I’m proposing that Apple add these features into the Camera app’s Shortcuts support, either in the form of expanded App Shortcuts or an overhaul to the Camera actions.
Currently, in Shortcuts, the Camera app has a single action, Open Camera, that opens the camera in a specified mode. As of writing, you’re able to choose from Selfie, Video, Portrait, Portrait Selfie, Photo, Cinematic, Slo-Mo, Time-Lapse, Pano, Spatial Video, and Spatial Photo.
Crude rendering by yours truly.
The simplest update would be to include options for Dual Capture and Landscape Selfies, allowing a quick addition to existing functionality. This would build upon the curated App Shortcuts experience, and make these new features immediately available via Siri, on the Lock Screen, in Control Center, and on the Action button nicely – the simplest and most likely outcome.
Overhaul Camera’s App Intents Support
However, I propose Apple give the Camera app a deeper App Intents review and consider splitting up the Open Camera action in alignment with the Camera app redesign, building out the longstanding Take Video and Take Photo actions from Workflow and including additional functionality as parameters.
Take Video could include modes (and App Shortcuts) for Video, Cinematic, Slo-Mo, and Timelapse, each with dependent parameters for front-/rear-facing cameras, zoom levels and rotate options, extra features, and video formats. Take Photo could include modes (and App Shortcuts) for Photo, Selfie, Portrait, Spatial, and Pano, with the same additional functionality as parameters for each mode.2
Adding both options as separate actions would add long-desired functionality to the Camera app’s existing actions and enable a wide array of creator-focused shortcuts based on hyper-specific shooting modes. Plus, these actions could still be turned into App Shortcuts, enabling everyday users to quickly access Dual Capture or landscape-in-portrait selfies on their new iPhone 17 as needed.
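To illustrate the shape of that proposal, here’s a hypothetical App Intents sketch – none of this is Apple’s actual Camera implementation, just what a split-out Take Video action might look like using the real AppIntent and AppEnum APIs:

```swift
import AppIntents

// Hypothetical modes for a "Take Video" action – a sketch of the
// proposal above, not Apple's actual Camera intent.
enum VideoCaptureMode: String, AppEnum {
    case video, cinematic, sloMo, timeLapse, dualCapture

    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Video Mode"
    static var caseDisplayRepresentations: [Self: DisplayRepresentation] = [
        .video: "Video", .cinematic: "Cinematic", .sloMo: "Slo-Mo",
        .timeLapse: "Time-Lapse", .dualCapture: "Dual Capture",
    ]
}

struct TakeVideoIntent: AppIntent {
    static var title: LocalizedStringResource = "Take Video"
    static var openAppWhenRun: Bool = true

    @Parameter(title: "Mode", default: .video)
    var mode: VideoCaptureMode

    @Parameter(title: "Use Front Camera", default: false)
    var useFrontCamera: Bool

    func perform() async throws -> some IntentResult {
        // Launch capture in the requested configuration (stubbed here).
        .result()
    }
}
```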
Apple – please make it easier to take landscape selfies!3
If you want to see this update, please duplicate my report4 in the Feedback app to signal to Apple that multiple users want this changed.
FYI: according to the alt text on the Apple Support website, it is officially called “the Selfie Rotate button.” ↩
There may need to be some slight fudging of “modes” to make a pleasant App Shortcuts experience here – otherwise, having “normal,” “Selfie,” and “Landscape Selfie” versions of each as additional options might be too much – I can see why they might’ve chosen to avoid this route originally. That being said, they should go further with more actions rather than pulling back. ↩︎
There’s got to be a better way to say “enabling landscape selfies while holding iPhone vertically” (from 3:35) – I propose “landscape selfie” as the generic term. ↩︎
On iPhone 17, new Camera modes like Dual Capture and Selfie Rotate let users record from both cameras or film landscape selfies while holding iPhone vertically. These features aren’t available in Shortcuts or App Shortcuts, making them harder to access quickly. The simplest improvement would be adding Dual Capture and Selfie Video options to the existing Open Camera action. Longer term, Camera could gain full App Intents support by splitting Open Camera into Take Photo and Take Video actions with parameters for mode, camera, and format. Results Expected: I am expecting to find all Camera functionality, including Dual Capture, Selfie Video, and future modes, available in the Shortcuts app or App Shortcuts experiences for use from the Lock Screen, Control Center, or Action button. ↩