Categories
News

Shortcuts gains actions for Apple Intelligence, Messages, and Notes checklists in iOS 26

In iOS 26, Apple is adding a series of exciting new actions to Shortcuts, with a heavy focus on Apple Intelligence, including direct access to its Foundation Models via the new Use Model action.

Alongside that, Apple has added actions for Writing Tools, Image Playground, and Visual Intelligence, plus the ability to Add Files to Freeform and Notes, Export in Background from the iWork apps, and new Find Conversations & Find Messages actions for the Messages app, among others.

Plus, new updates to current actions—like turning Show Result into Show Content—make existing functionality easier to understand.

Here’s everything that’s new – available now in Public Beta:

Apple Intelligence

 

The major focus of actions in iOS 26 is access to Apple Intelligence, both directly from the Foundation Models and indirectly through pre-built Writing Tools actions and Image Playground actions – plus a simple “Open to Visual Intelligence” action that seems perfectly suited for the Action button.

Use Model

  • Use Model
    • Private Cloud Compute
    • Offline
    • ChatGPT Extension

Writing Tools

  • Make Table from Text
  • Make List from Text
  • Adjust Tone of Text
  • Proofread Text
  • Make Text Concise
  • Rewrite Text
  • Summarize Text

Visual Intelligence

  • Open to Visual Intelligence

Image Playground

  • Create Image

Actions

Apple has added new actions for system apps and features, starting with an interesting Search action that pulls in a set number of results, similar to Spotlight.

Both Freeform and Notes got “Add File” actions, plus you can now add directly to checklists in Notes too. Apple put Background Tasks to work with exporting from the iWork apps, and nice-to-have actions for Sports, Photos, and Weather make it easier to take advantage of those apps.

Particularly nice are Find Conversations and Find Messages, the former of which works well with Open Conversation, and the latter of which is a powerful search tool.

Search

  • Search

Freeform

  • Add File to Freeform

Notes

  • Add File to Notes
  • Append Checklist Item

iWork

  • Export Spreadsheet in Background
  • Export Document in Background
  • Export Presentation in Background

Documents

  • Convert to USDZ

Sports

  • Get Upcoming Sports Events

Photos

  • Create Memory Movie

Messages

  • Find Conversations
  • Find Messages

Weather

  • Add Location to List
  • Remove Location from List

Updated

Apple continues to make Shortcuts actions easier for new users to understand and adopt, making small tweaks like clarifying Show Content and Repeat with Each Item.

Plus, existing actions like Calculate Expression, Translate, and Transcribe have benefitted from system-level improvements:

  • Show Result is now titled Show Content
  • Repeat with Each is now labeled “Repeat with Each Item” once placed
  • Open Board for Freeform now shows as App Shortcuts
  • Calculate Expression can accept real-time currency data
  • Translate has been improved
  • Transcribe has been improved
  • “Use Search as Input” added to Shortcut Input

Coming this Fall

These new actions are available now in Public Beta—install at your own risk—and will be fully available in the fall once iOS 26 releases.

There are also further improvements on the Mac, which gained Automations in Shortcuts—including unique File, Folder, and Drive automations only available on Mac—plus the ability to run actions directly in Spotlight. I’ll cover these in future stories – be sure to check the features out if you’re on the betas.

I will update this post if any more actions are discovered in future betas, or if there’s anything I’ve missed here.

P.S. See Apple’s video “Develop for Shortcuts and Spotlight with App Intents” for the example shortcut in the header photo.

Categories
Custom Shortcuts

New in the Shortcuts Library: Emoji Game for Apple News+

I’ve just added a new shortcut to my set of Apple News shortcuts in the Shortcuts Library – one for the Emoji Game in Apple News+.

Here’s how Apple describes the game:

“The object of this logic and word puzzle is to complete several phrases with as few moves as possible. Each emoji may be interpreted directly, through association, or in combination with other emoji. When you attempt an answer or expand a clue, it counts as a move.”

You can also do any of the following:

  • Try an answer: Consider the various definitions or associations for the emoji, then drag the most appropriate emoji (or group of emoji) to complete each word or phrase. For example, a “pear” emoji 🍐 could complete “DISAP_ _ _ _,” but interpreted as “fruit” it could complete “_ _ _ _ _ FUL.” Letters relating to emoji may appear nonconsecutively in an answer. For example, dragging an “earth” emoji 🌍 to “L_ _ _N _ _E ROPES” completes the phrase “learn the ropes.” Interpret grouped emoji as a whole. For example, a single 🐠 could mean “fish,” while 🐠🐠🐠 might mean “school.”
  • Expand a clue: Tap [Eye icon]. This counts as a move.
  • Reveal answers: Tap [Three Dots icon], then tap Reveal. The answers you didn’t find are shown. The puzzle doesn’t count in your Scoreboard stats and streaks.

Get the Emoji Game shortcut, check out the folder of Apple News Plus shortcuts, and browse the full Shortcuts Library – or read about Emoji Game from Apple Support.

Categories
Links

Apple News+ introduces Emoji Game 🍎📰➕ 😀🧩

From the Apple Newsroom:

Today, Apple News+ debuted Emoji Game, an original puzzle that challenges subscribers to use emoji to complete short phrases. Emoji Game is now available in English for Apple News+ subscribers in the U.S. and Canada.

“Emoji Game is the perfect addition to the Apple News+ suite of word and number puzzles, turning the emoji we use every day into a brainteaser that’s approachable and fun,” said Lauren Kern, editor-in-chief of Apple News.

More Apple News shortcuts incoming in 3… 2… 1…

View the full story from Apple.

 

Categories
Developer

Here are Apple’s WWDC25 Developer Sessions on the Foundation Models Framework

At WWDC25, Apple opened up its Foundation Models to third-party developers, making intelligence features easier to implement while maintaining privacy.

With the framework, developers can access local, on-device models from Apple, make requests to Private Cloud Compute when needed, and readily adopt tools like the Vision framework or SpeechAnalyzer.
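For a rough sense of what that looks like in practice, here’s a minimal sketch of a request to the on-device model using the LanguageModelSession API covered in these sessions – the function, prompt, and error handling are my own illustrative assumptions, not sample code from Apple:

```swift
import FoundationModels

/// A minimal sketch (not production code) of asking the on-device model to
/// summarize some text with the Foundation Models framework from WWDC25.
func summarize(_ text: String) async throws -> String? {
    // The model is only available on Apple Intelligence–enabled devices.
    guard case .available = SystemLanguageModel.default.availability else {
        return nil
    }

    // Instructions steer the model; the wording here is just an example.
    let session = LanguageModelSession(
        instructions: "You summarize text in one short, plain-language paragraph."
    )
    let response = try await session.respond(to: "Summarize this: \(text)")
    return response.content
}
```

In a real app you’d likely reuse the session across requests and stream partial output rather than waiting for the full response.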

In introducing these capabilities, Apple has produced the following Machine Learning & AI sessions:

Apple Developer sessions on Machine Learning & AI from WWDC2025

Intro

Foundation Models

MLX

Features

More

Explore all the Machine Learning & AI sessions from WWDC25, plus check out my recommended viewing order for the App Intents sessions.


P.S. Here’s the full list of sessions, no sections – copy these into your notes:

List of Apple Developer sessions on Machine Learning & AI from WWDC2025

Categories
Developer

Watch the WWDC2025 App Intents Developer Sessions In This Order

After announcing updates at WWDC, Apple released four new developer sessions directly related to App Intents—the API that lets Apple Intelligence understand and interact with apps—following up on sessions from years past.

Here are this year’s sessions – in my recommended viewing order:

  1. Get to Know App Intents (24:36)
  2. Explore new advances in App Intents (26:49)
  3. Develop for Shortcuts and Spotlight with App Intents (18:56)
  4. Design Interactive Snippets (7:28)

Start with the summary of the API, see what’s new this year, learn the most relevant ways users will interact with your app, and then take a look at advances in snippets – in 1 1/2 hours of focused viewing.

Enjoy – there’s lots to learn!

Check out all the Machine Learning & AI videos from WWDC25 from Apple, plus check out my curated list of the Foundation Models framework sessions.

Categories
Developer Links News

Apple Supercharges Its Tools and Technologies for Developers to Foster Creativity, Innovation, and Design »

From Apple’s announcements at WWDC:

App Intents lets developers deeply integrate their app’s actions and content with system experiences across platforms, including Siri, Spotlight, widgets, controls, and more.

This year, App Intents gains support for visual intelligence. This enables apps to provide visual search results within the visual intelligence experience, allowing users to go directly into the app from those results. For instance, Etsy is leveraging visual intelligence to enhance the user experience in its iOS app by facilitating faster and more intuitive discovery of goods and products.

“At Etsy, our job is to seamlessly connect shoppers with creative entrepreneurs around the world who offer extraordinary items — many of which are hard to describe. The ability to meet shoppers right on their iPhone with visual intelligence is a meaningful unlock, and makes it easier than ever for buyers to quickly discover exactly what they’re looking for while directly supporting small businesses,” said Etsy CTO Rafe Colburn.

Read the full post from the Apple Newsroom.
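P.S. For a sense of what this looks like on the developer side, here’s a minimal, hypothetical App Intent – the intent, parameter, and dialog below are my own illustrative placeholders, not Etsy’s implementation or the new visual intelligence API:

```swift
import AppIntents

/// A hypothetical App Intent. Declaring actions like this is what lets Siri,
/// Spotlight, Shortcuts, widgets, and controls reach into an app's features.
struct SearchListingsIntent: AppIntent {
    static var title: LocalizedStringResource = "Search Listings"

    @Parameter(title: "Query")
    var query: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would run the search and return matching content here;
        // this sketch just acknowledges the request.
        return .result(dialog: "Searching listings for \(query)…")
    }
}
```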

Categories
Developer Livestreams Offsite

AI + iOS: The State of Apple Development Ahead of WWDC (feat. Rudrank Riyam)

From my stream with Rudrank Riyam on YouTube Live — tune in:

This week, developer Rudrank Riyam, author of the AiOS Dispatch newsletter, joins me to talk about his experiences developing for Apple platforms using AI-assisted coding, and what we’re interested in ahead of Apple’s Worldwide Developer Conference (WWDC).

Subscribe to the newsletter here: https://aiosdispatch.com

View the stream live or catch the replay on YouTube.

Categories
Developer Livestreams Offsite

Apple Intelligence: Action Centered Design Framework (feat. Vidit Bhargava)

From my stream with Vidit Bhargava on YouTube Live — tune in:

This week, designer and developer Vidit Bhargava joins me to talk about his framework for app development centered around designing actions first, particularly as it relates to Apple Intelligence.

Read about the framework here: https://blog.viditb.com/action-centered-design-framework-talk/

Chapters (generated with Descript):

00:00 Introduction and Guest Welcome

02:32 Guest Background and App Development

03:45 Evolution of App Design

05:56 Action Centered Design Framework

08:46 Designing for Multiple Platforms

13:21 App Intents and Practical Examples

16:52 Future of App Design and AI Integration

48:14 Demo and Practical Applications

57:06 Exploring App Intents and Practicality

57:27 Challenges of Mobile AI Implementation

58:12 Battery Life and AI Advancements

58:53 Apple’s Approach to AI and Actions

01:01:35 The Future of Shortcuts and Automation

01:03:38 Innovative UI and Interaction Design

01:08:53 Custom Interactions and Maintenance

01:10:27 Generative Coding and Platform Variability

01:15:17 AI and App Intents in Real-World Applications

01:33:59 Economic Models for AI-Driven Apps

01:41:20 Concluding Thoughts and Future Prospects

View the stream live or catch the replay on YouTube.

Categories
Announcements Developer

Announcing My WWDC Meetup: Apple Intelligence Automators at CommunityKit

Hello friends! It’s my pleasure to announce my second-annual WWDC meetup1, this time as part of the free CommunityKit conference under the name “Apple Intelligence Automators” – sign up here for the free event on Tuesday, June 10 from 2:00 PM – 4:00 PM.

Located inside the Hyatt House San Jose / Cupertino at 10380 Perimeter Rd in Cupertino (just a few minutes from Apple Park), we’ll be discussing the announcements from the WWDC keynote address and State of the Union from the day prior as they relate to Apple Intelligence, App Intents, and Shortcuts.

With Apple Intelligence being the focus of last year’s WWDC, and delays on those features pushing things back, we should have plenty to talk about.

Check out the event page on Luma to register and don’t forget to get your free ticket to CommunityKit.

  1. I hosted a Shortcuts meetup last year – and had a blast.
Categories
Developer

You Should Watch The Apple Intelligence Developer Sessions In This Order

(Editor’s note: updated June 2025 to include sessions from WWDC25)

If you’re getting into development for Apple Intelligence, it can be hard to know how to parse Apple’s documentation. App Intents, the API that powers the Actions and Personal Context features of Apple Intelligence, has been shipping since 2022, with a deeper history going back to the introduction of Shortcuts in 2018 – there are over 30 sessions to learn from.

Since I’ve been consulting with developers on their App Intents integrations, I’ve developed a Star Wars Machete Order-style guide for the Apple Intelligence developer sessions – watch the sessions in this order to best understand how Apple thinks about these APIs.

Apple Intelligence Machete Order

How to understand the App Intents framework

Start with the latest sessions from 2024, which reintroduce App Intents as it extends across the system in more ways and update the design suggestions from the earlier framework:

Getting Deeper into App Intents

From there, once you have the context of how App Intents can be deployed, start back at the beginning to see how to implement App Intents, then take a look at where they are heading with Snippets:

Importance of App Shortcuts

Built on top of App Intents, App Shortcuts are automatically generated for the most important tasks and content that show up in Spotlight and Siri – and are often the most common way users interact with the App Intents framework:
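To make that concrete, here’s a small hypothetical sketch of how an App Shortcut is declared on top of an intent – the intent, phrases, and names are placeholders, not from a real app:

```swift
import AppIntents

/// A hypothetical intent plus an App Shortcuts provider, sketched to show how
/// App Shortcuts sit on top of App Intents. All names here are placeholders.
struct OpenInboxIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Inbox"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // A real implementation would navigate to the inbox screen here.
        return .result()
    }
}

struct ExampleShortcuts: AppShortcutsProvider {
    // Declaring the shortcut here is what surfaces it automatically in
    // Spotlight and Siri, with no setup required from the user.
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenInboxIntent(),
            phrases: ["Open my \(.applicationName) inbox"],
            shortTitle: "Open Inbox",
            systemImageName: "tray"
        )
    }
}
```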

Apple Intelligence sessions

Finally, once you understand the core of App Intents, what it used to be vs. what Apple wants you to do now, and how to deploy App Intents across Spotlight and Siri, move onto the latest updates for Apple Intelligence – new features that enable Personal Context, as well as integrating your intents into domains for Siri:

Good to know

Beyond that, it can be helpful to review earlier sessions to understand where Apple is coming from, as well as learning about the lesser-known experiences your app is capable of providing:

All the Apple Intelligence developer sessions

For good measure, here’s the full list of the Shortcuts / App Intents / Apple Intelligence developer sessions:

Check out more Machine Learning and AI videos from the Apple Developer site, read the full App Intents documentation, and learn more about Apple Intelligence.

P.S. You can hire me to design your App Intents integration.

 

Categories
Gear How To Links

How Fast Should My Internet Be To Stream? »

Elgato has shared a helpful guide for ensuring your internet is fast enough for streaming:

Streaming your content live online is more accessible than ever and also can be equally data hungry. From streaming right from your phone to your followers on TikTok to streaming a professional event in 4K 60fps with High Dynamic Range on YouTube, these all require some amount of bandwidth to get your live content to where it needs to be.

And:

In short, here’s the maximum bitrates supported by the services.

Twitch: 6Mbps (up to 1080p)

YouTube: 40Mbps (Up to 4K)

Also, these are upload speeds – and if you’re streaming to more than one service at once, it can take even more. I use High Quality Audio and High Quality Video in Ecamm Live, which also add to the network load.
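As a rough sanity check before going live, here’s the back-of-the-envelope math I do – the bitrates, the second destination, and the 1.5× headroom factor below are my own rule-of-thumb assumptions, not figures from Elgato:

```swift
// Back-of-the-envelope upload check. All values are illustrative
// assumptions in Mbps, not official requirements from any service.
let videoBitrate = 6.0   // e.g. a 1080p stream to Twitch
let audioBitrate = 0.32  // high-quality audio at roughly 320 kbps
let destinations = 2.0   // streaming to two services at once
let headroom = 1.5       // spare capacity for everything else on the network

let recommendedUpload = (videoBitrate + audioBitrate) * destinations * headroom
print("Aim for at least \(recommendedUpload) Mbps of upload")  // ≈ 19 Mbps
```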

Check out the post from Elgato and get Ecamm Live to stream from your Mac.

Also, be sure to use my discount code ZZ-CASSINELLI for 5% off from Elgato.

Categories
Announcements

6 Years Later: ‘Worth A Look. Something For Everyone.’

Earlier this evening, I was working on my soon-to-be-relaunched newsletter What’s New in Apple Intelligence, and I opened Reeder to find an article from Daring Fireball titled “15 Years Later: ‘Very Insightful and Not Negative’.”

In the post, Gruber recalled a comment Steve Jobs had made about his blog, which he inexplicably hadn’t linked to (until now) – upon reading that, I realized I had my own Tim Cook moment in a similar realm.

Six years ago, on February 28, 2019, I published a story for iMore where I collected over 100 apps that work with Siri, tweeting:

If you’ve been wanting to get into Siri Shortcuts but don’t know where to start…

Here are 100 apps with shortcuts you can add to Siri and use with your own custom phrase: https://x.com/iMore/status/1101285345390444545

The next day, at about 3pm, Tim Cook quoted my post, saying:

“Worth a look. Something for everyone.”

In the moment, I sent a simple “Thanks Tim!” back, but never posted about it on my blog. So, here we are – while you’re here, see what else I have that’s worth a look – there’s something for everyone.

Check out Tim Cook’s tweet and view the story on Daring Fireball.

Categories
Custom Shortcuts News Shortcuts

Apple Releases “Hold That Thought” Shortcut for Global Accessibility Awareness Day

Today is Global Accessibility Awareness Day (GAAD), which Apple highlighted in their press release showcasing accessibility features coming in the next year – plus a new Accessibility shortcut called Hold That Thought:

New features include Accessibility Nutrition Labels on the App Store, Magnifier for Mac, Braille Access, and Accessibility Reader; plus innovative updates to Live Listen, visionOS, Personal Voice, and more.

Near the end of the release, Apple explains their new shortcut, plus the addition of the previous Accessibility Shortcut to Vision Pro:

The Shortcuts app adds Hold That Thought, a shortcut that prompts users to capture and recall information in a note so interruptions don’t derail their flow. The Accessibility Assistant shortcut has been added to Shortcuts on Apple Vision Pro to help recommend accessibility features based on user preferences.

Here’s how Apple describes the shortcut:

Interruptions can cause you to forget tasks and affect productivity, especially for neurodivergent individuals.

When you run this shortcut, you have two options: Capture and Recall.

Run the shortcut and select Capture to capture a screenshot of what you’re doing, any calendar events in the next hour, current open webpage in Safari (Mac only), and Clipboard contents. You’ll then be prompted to write short notes about what you are doing and what you are about to do. Run the shortcut again and select Recall to find the last created note with all the captured information. All notes will be saved with the title “Hold that thought” and the date and time saved.

Run this shortcut using Siri, or add it to the Control Center, Action button or to the Home Screen for quick access.

I love this idea, and the core concept matches the inspiration for my currently-secret app idea that I teased at the end of my Deep Dish Swift talk.

I do have a few suggestions for improvements to the shortcut, however:

  • Remove the errant space in the Choose From Menu prompt between “Capture” and “or” – it says “Capture or recall last stopping point?”
  • For both the “Capture” and “Recall” options in Choose From Menu, Apple should add Synonyms for “Capture” and “Recall” – the emoji prefixes can cause issues when dictating to Siri (in general, I avoid emoji in Menus for this reason).
  • Utilize the “Find Tabs” action for iOS instead of simply not adding any functionality for Safari on mobile; Apple’s use of only “Get Current Safari Tab” for Mac reminds me that they still have not brought the Safari actions added to iOS back in 2022 over to macOS, and their absence in this shortcut furthers my belief that these highly-sought actions are deprioritized simply because the team doesn’t use iOS as often and this Mac action is “good enough”.
  • The second “Recall” option just opens the note, but I’d rather see the last item I saved – Apple should have gone further to isolate the most recent item and display the recalled information, not just open it again. I tried to Recall from my Apple Watch and the shortcut simply failed.
  • The flow of an alert, a 5-second countdown before a screenshot, and two prompts might be too long for most neurodivergent people to capture information effectively while in the process of being interrupted.

To improve the shortcut as it is today, I’d simply remove the Show Alert and Wait actions, and assign this new shortcut to the Action button – that way you can immediately take a screenshot, then answer the prompts, and move on.

Going further, I’d love to see a new version of this next year once Apple Intelligence ships in full, which utilizes “Get On-Screen Content” and accesses all the data available from apps for Personal Context.

Get “Hold That Thought” for Shortcuts, view the announcement from the Apple Newsroom, and check out past updates from GAAD.

Categories
Gear How To Links

How To Archive Your Live Stream And Why To Do It »

Elgato has shared a handy guide for making sure you always have a copy of your livestream for later – something that tripped me up when I first started on Twitch:

You just finished a stream full of funny moments, interesting discussion, or a great final stand in a battle royale. But what happened to all that content? Did it just end up in the void or did you make sure to save it for later?

Some streaming services like Twitch.TV have time limits for how long they’ll hold onto your past livestream. If you want those moments to live on, you’ll need to archive them somehow. And in some cases, videos are simply unavailable after the stream due to copyright reasons. If you don’t enable storage of those past streams, it’s been too long, or you listened to some music on stream by accident, those moments could just be history.

If you stream right onto YouTube, you’re already set, as those will automatically be archived as a regular video – as long as you weren’t doing a subathon for over 12 hours. If all you want is for your streams to live on, you’re good to go.

Check out the post from Elgato and follow me on Twitch.

Also, be sure to use my discount code ZZ-CASSINELLI for 5% off from Elgato.

Categories
Apps Gear Links

The Stream Deck App Is More Mac-Like in Version 6.9

My friends at Elgato have updated the Stream Deck app for Mac to version 6.9 with new features for launching and closing an app from the same key, pasting from the clipboard in Text actions, choosing your browser when opening URLs, and quality-of-life improvements specifically to make the app more Mac-like.

Here’s how they describe the updates:

🆕 Open App – From launch to close, all on one key

Launching your favorite apps just got easier. The new Open Application action shows a searchable list of installed programs—no file path hunting required.

And it now works with Windows Store (UWP) apps like Discord, Spotify, and Microsoft Teams.

✨ Pro tip: Press and hold the key to close the app. Quick in, quick out.

[…]

📝 Text Action – Now with Clipboard Paste Mode

The Text action now includes paste mode selection with options for simulating typing or pasting from clipboard. Simulate typing is exactly that—it’s as if you’re typing the text. Paste from Clipboard is new and text is pasted as if you’ve pressed CTRL/CMD+V. The default mode has been changed to Paste from Clipboard.

Simulate Typing is useful for programs that need inputs, like typing out commands in sim games

Paste from Clipboard is useful for chat applications, when you want to paste a block of text as a single message

[…]

🌐 Website Action – Choose Your Browser

Now you can choose exactly where your Website actions open. Chrome for work, Firefox for dev, Safari for testing—it’s all up to you.

Just pick from your installed browsers in the dropdown. The old “GET request in background” toggle now lives here too, tucked away for power users.

[…]

⬇️ Profiles just got smarter

Importing profiles from Marketplace? Stream Deck 6.9 now helps you hit the ground running. If a profile uses actions from plugins you don’t have, you’ll be prompted to install them automatically—no guesswork, no more question marks.

[…]

🪄 Quality of life improvements

A few small touches make a big difference:

Option to launch Stream Deck on startup

Option to disable automatic update checks

macOS only: Stream Deck now appears in the Dock when open

macOS only: You can now maximize the app using the green button

[…]

Check out the Elgato Stream Deck 6.9 Release Notes and get the Stream Deck from Elgato – be sure to use my discount code ZZ-CASSINELLI for 5% off.

Categories
Developer Shortcuts

How Apple Will Win the AI Race: My Talk on App Intents & Apple Intelligence

Last Tuesday, I gave a talk to over 300 developers at Deep Dish Swift about Apple Intelligence, where I made the following claim:

Apple will win the AI race

I’m an expert on App Intents, the API that powers the yet-to-be-seen features of Apple Intelligence – Actions and Personal Context. After designing implementations with my clients, seeing the trends around AI-assisted coding, hearing rumors of an iOS 19 redesign, and seeing the acceleration effects of artificial intelligence, I believe Apple is skating to where the puck will be, rather than where it is now.

I’ll leave the thesis for the talk – but if you’re building for any Apple devices, you’ll want to understand how important App Intents is to the future of the platform:

Watch the 54-minute talk from Deep Dish Swift on YouTube Live.

Categories
Announcements Developer

No Ticket to WWDC? Come to CommunityKit, the New, Free Alt Conf

If you’re interested in “going” to WWDC, but don’t have a developer ticket – you should sign up for CommunityKit, the alternative conference1 for Apple developers, media, and fans.

From June 9 through 11, join us at the Hyatt House Cupertino to gather with your fellow participants, learn so many new things, and build some great memories.

​Each day, we’ll be joined by our wonderful communities, such as Shortcuts, and iOSDevHappyHour, to name a few. We’ll also be hosting a live recording of Swift over Coffee, focusing on everything new at WWDC.

Yes, you read that right – I’ll be hosting a Shortcuts/Apple Intelligence/App Intents meetup during one of the afternoons that week! Schedules will be announced later, and I’ll update this post plus create another when I know my official time slot.

Located just a few minutes away from Main Street Cupertino and the Visitor Center at Apple Park, this free conference is designed specifically to make it easy to know where to go if you’re in town for WWDC, merging past events like the watch party from the iOS Dev Happy Hour into one event.

You can watch the WWDC Keynote and State of the Union with developer friends on Monday, plus attend live podcast recordings, join community meetups like mine, and access a hackathon space to work on new ideas all day Tuesday & Wednesday.

To be clear: this means most social events are moving from San Jose to being more focused in Cupertino this year, so folks don’t have to make their way back and forth across those 8 miles as much. This also means anyone coming from out of town or from San Francisco can stay/park at the Hyatt House each day and easily access most WWDC social events.

If you’re unsure whether it’s worth coming to WWDC, let this post convince you – it’ll be a blast and you’ll have something fun to do on Monday, Tuesday, and Wednesday that week.

WWDC is back!2 Get your free ticket to CommunityKit now.


  1. Not to be confused with the now-defunct AltConf
  2. Yes, the official conference has been back for years. But I kept hearing people at Deep Dish Swift ask if the social WWDC is “back”.

    Yes, it is! The social scene has been growing for a few years, but it took a while to figure out how to do it better.

    Now, more of us are coordinating to make it like the old days, when, if you didn’t have a ticket, you could go to AltConf. Today, you can go to CommunityKit!

 

 

Categories
Developer Shortcuts

Tune In To My Apple Intelligence Talk via Deep Dish Swift Live

I’m super excited to be giving my talk on Apple Intelligence live tomorrow at Deep Dish Swift – if you’re interested in tuning in to the conference stream, follow Deep Dish Swift on YouTube:

Check out Deep Dish Swift live and learn more about the conference.

Categories
Developer Livestreams Offsite

The Future of App Development: Artificial Intelligence and App Intents (feat. Connor Hammond)

From my stream with Connor Hammond on YouTube Live — tune in:

AI consultant and app developer Connor Hammond joins me to talk about the future of app development in a world of AI, particularly in relation to Apple’s App Intents APIs.

We discussed questions such as:

  • What does it mean to develop with AI at your side?
  • As AI tools speed up development, how can app developers harness new capabilities within Apple’s schedule?
  • If App Intents is Apple’s strategy, what does that mean for all AI platforms?
  • How can apps take advantage of App Intents to deploy their functionality across Apple’s platforms?
  • How do AI-enabled apps provide more value than the AI tools themselves?

View the stream live or catch the replay on YouTube.

Categories
How To

How To Rotate Upside-Down Top-Down Camera Footage using the Final Cut Pro Browser

In the process of switching my mounted overhead video setup from a backdrop bar to the Elgato Multi-Mount, I had to make one significant shift – filming upside-down, since the camera is now attached to the back of the desk instead of mounted above from the front. Unfortunately, that means all of my footage needs to be rotated before being usable in editing programs.

In Final Cut Pro for Mac, you can easily rotate clips once you’ve added them to the timeline. However, I’m not actively building a story yet, and I’m instead using the Browser to organize my footage into individual clips using in/out points and Favorites. For a long recording like an unboxing, I can turn an hour of footage into a full set of individual moments as clips, all timed exactly for the action, renamed as needed, and split apart as separate entities in the Browser.

This process means that, by default, all of my Browser clips are also upside-down, and at first glance this seemed like a big problem for my editing style – timeline editing is very different from clipping in the Browser, and I might be out of luck.

However, thanks to “2old2care” on Reddit (great username), the solution lies in the “Open Clip” menu option, which I’d never used before:

Yes, you can invert the clip in the browser. Select the clip, then under “Clip” menu select “Open Clip”. You can then go to transform and rotate the clip 180º. I don’t know of a way to create a batch in FCP to do this, although it can be done for the original clips using Compressor.

To save myself the trouble of remembering later, I took screenshots of the process – here’s my setup in Organize mode (under Window > Workspaces > Organize):

How to rotate clips within the Browser using Final Cut Pro

  1. Select the clip you want to rotate – use the filmstrip to identify which files were filmed upside-down.
  2. In the Menu Bar, navigate to Clip > Open Clip, which has no keyboard shortcut. Optionally, assign a keyboard shortcut under Final Cut Pro > Command Sets > Customize (or use ⌥ + ⌘ + K / Option + Command + K to customize immediately).
  3. In the Final Cut Pro window, the selected clip will open in its own timeline view. In the Inspector, select Transform and change the Rotation from 0° to 180°.
  4. In the center of the window, find the clip name and click the dropdown arrow next to it to reveal a context menu – close the clip to return to the full Browser view. The filmstrip will show the flipped clip as you scroll; however, it will continue to show the original upside-down version in the static filmstrip until you leave the project and navigate back or refresh the window.
  5. Repeat for each upside-down clip.

As 2old2care mentioned, batch-processing files like this would be a better solution – I’ll update this post if I find one.

Check out the source on Reddit, get the Multi-Mount from Elgato, and get Final Cut Pro for Mac from Apple.