Categories
Developer

New App Intents and Apple Intelligence Consulting Availability

App Intents are how Apple devices understand and interact with your app. They’re the foundation of features like Shortcuts, Siri, Spotlight – and now Apple Intelligence. If you want your app to take advantage of the deepest parts of the Apple ecosystem, it starts with App Intents.
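
If you haven’t worked with the framework before, here’s roughly what a single App Intent looks like in Swift. This is a minimal sketch with hypothetical names, not code from any client project:

```swift
import AppIntents

// Minimal sketch of an App Intent (hypothetical names).
// This is the unit that Shortcuts, Siri, Spotlight, and Apple Intelligence call into.
struct StartTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Timer"

    @Parameter(title: "Minutes")
    var minutes: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Your app's real timer logic would run here.
        return .result(dialog: "Started a \(minutes)-minute timer.")
    }
}
```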

Following recent projects with Foodnoms, MindNode, and Tripsy, I now have availability for App Intents consulting this fall and into 2026. In addition to full start-to-finish projects, I’m introducing new flexible options:

  • Audits: a focused review of your existing intents, data models, and opportunities
  • Docs-only: structured documentation you can use with your team to implement directly

If you want Apple Intelligence to understand your app’s core features, or you want to deploy your app across the system to make a cohesive experience, I can help you design and deliver the following:

  • The unreleased Actions and Context portions of Apple Intelligence
  • App Intents, App Entities, and App Enums for your app
  • Automatically-generated instances of important intents as App Shortcuts
  • Spotlight, Siri, and Controls integrations
  • Custom Shortcuts to be distributed to users
  • Documentation on the new offerings
  • Ongoing updates shared in a developer newsletter

Each engagement starts with a free, 1-hour call to assess your needs, discuss budgets and rates, and outline next steps – whether you’re working at a brand, part of a team, or an indie developer, we can find a solution that works for you.

You can learn more about my services, explore past client work, and watch my conference talks on my Consulting page. If you’re ready to move forward, book a call with me directly to get started.

Let’s make your app one of the best citizens of the Apple ecosystem – ready for Apple Intelligence, Shortcuts, and beyond.

Categories
Membership Tips & Tricks

Technique: Open URLs into Mac apps using Shell Scripting

In my latest addition to the Shortcuts Library, I updated my shortcuts for the TV app with expanded functionality, including many new functions, redesigned menus, and, critically, support for macOS.

I accomplished Mac support using Shell Scripting, a technique which I’m sharing for members:

This content is marked as members-only – you’ll need a membership to access it.

Categories
Shortcuts

Updated in the Shortcuts Library: TV app shortcuts

Hey members! I’ve just updated a folder in the Shortcuts Library — my set of TV app shortcuts.

These are redesigned around my new approach to building shortcuts, which moves away from many separate single-action shortcuts toward a more bundled approach – each shortcut provides more functionality in a targeted area.

My favorite is the new Watchlist shortcut – I’ve been working on a version of this for a year or so! Enjoy:

  • Open into the TV app: Presents a menu of sections in the TV app and opens the deep link into the app on iPhone, iPad, and Mac – options include Home, Search, Store, Sports, Apple TV+, and Library. When run from Apple Watch, opens the Apple TV app.
  • Add to my TV watchlist: Accepts a list of TV shows or movies, scrapes the results from Apple’s Marketing Toolbox, and lets you pick where to send the media – with options to open into the TV app, add to your Watchlist, send to Reminders, or copy the links.
  • Open sports in the TV app: Presents a menu of Sports sections available in the TV app, including overall Sports, plus MLS Season Pass and Major League Baseball, as well as a dedicated section for your favorite home team.
  • Browse the TV Store: Presents a menu for opening into the TV app to the Store section, either directly using a deep link, using the iTunes actions in Shortcuts, or Apple’s RSS feeds for top movie and TV content – plus categories for dedicated “rooms” in the TV app for special content.
  • Open from Apple TV Plus: Presents menu options for opening into the Home, Shows, Movies, and Upcoming sections of Apple TV+ in the TV app, plus categories for genres.

Check out the folder of TV app shortcuts on the Shortcuts Library.

 

Categories
Links Offsite Podcasts Siri Shortcuts

Members-Only Podcast #3: New actions in the iOS 26 beta

This content is marked as members-only – you’ll need a membership to access it.

View the archive of members-only podcast episodes.

Categories
News

Shortcuts gains actions for Apple Intelligence, Messages, and Notes checklists in iOS 26

In iOS 26, Apple is adding a series of exciting new actions to Shortcuts, with a heavy focus on Apple Intelligence, including direct access to its Foundation Models with the new Use Model action.

Alongside that, Apple has actions for Writing Tools, Image Playground, and Visual Intelligence, plus the ability to Add Files to Freeform and Notes, Export in Background from the iWork apps, and new Find Conversation & Find Messages actions for the Messages app, among others.

Plus, new updates to current actions—like turning Show Result into Show Content—make existing functionality easier to understand.

Here’s everything that’s new – available now in Public Beta:

Apple Intelligence

 

The major focus of actions in iOS 26 is access to Apple Intelligence, both directly from the Foundation Models and indirectly through pre-built Writing Tools actions and Image Playground actions – plus a simple “Open to Visual Intelligence” action that seems perfectly suited for the Action button.

Use Model

  • Use Model
    • Private Cloud Compute
    • Offline
    • ChatGPT Extension

Writing Tools

  • Make Table from Text
  • Make List from Text
  • Adjust Tone of Text
  • Proofread Text
  • Make Text Concise
  • Rewrite Text
  • Summarize Text

Visual Intelligence

  • Open to Visual Intelligence

Image Playground

  • Create Image

Actions

Apple has added new actions for system apps and features, starting with an interesting Search action that pulls in a set number of results, similar to Spotlight.

Both Freeform and Notes got “Add File” actions, plus you can add directly to checklists in Notes now too. Apple put the Background Tasks to work with exporting from iWork apps, and nice-to-have actions for Sports, Photos, and Weather make it easier to take advantage of those apps.

Particularly nice are Find Conversations and Find Messages, the former of which works well with Open Conversation, and the latter of which is a powerful search tool.

Search

  • Search

Freeform

  • Add File to Freeform

Notes

  • Add File to Notes
  • Append Checklist Item

iWork

  • Export Spreadsheet in Background
  • Export Document in Background
  • Export Presentation in Background

Documents

  • Convert to USDZ

Sports

  • Get Upcoming Sports Events

Photos

  • Create Memory Movie

Messages

  • Find Conversations
  • Find Messages

Weather

  • Add Location to List
  • Remove Location from List

Updated

Apple continues to make Shortcuts actions easier to understand and adopt for new users, making small tweaks like clarifying Show Content and Repeat with Each Item.

Plus, existing actions like Calculate Expression, Translate, and Transcribe have benefitted from system-level improvements:

  • Show Result is now titled Show Content
  • Repeat with Each is now labeled “Repeat with Each Item” once placed
  • Open Board for Freeform now shows as App Shortcuts
  • Calculate Expression can accept real-time currency data
  • Translate has been improved
  • Transcribe has been improved
  • “Use Search as Input” added to Shortcut Input

Coming this Fall

These new actions are available now in Public Beta—install at your own risk—and will be fully available in the fall once iOS 26 releases.

There are also further improvements on the Mac, which gained Automations in Shortcuts—including unique File, Folder, and Drive automations only available on Mac—plus the ability to run actions directly in Spotlight. I’ll cover these in future stories – be sure to check the features out if you’re on the betas.

I will update this post if any more actions are discovered in future betas, or if there’s anything I’ve missed here.

P.S. See Apple’s video “Develop for Shortcuts and Spotlight with App Intents” for the example shortcut in the header photo.

Categories
Developer

Here are Apple’s WWDC25 Developer Sessions on the Foundation Models Framework

At WWDC25, Apple expanded access to their Foundation Models to third-party developers, making intelligence features easier to implement while maintaining privacy.

With the framework, developers can access local, on-device models from Apple, make requests to Private Cloud Compute when needed, and readily adopt tools like the Vision framework or SpeechAnalyzer.
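
As a rough illustration of what that access looks like in code, here’s a minimal Swift sketch based on the session APIs Apple showed at WWDC25 (exact signatures may shift between betas):

```swift
import FoundationModels

// Minimal sketch: ask the on-device model for a reply.
// Based on Apple's WWDC25 sessions; details may change across betas.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize in one sentence: \(text)")
    return response.content
}
```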

In introducing these capabilities, Apple has produced the following Machine Learning & AI sessions:

Apple Developer sessions on Machine Learning & AI from WWDC2025

Intro

Foundation Models

MLX

Features

More

Explore all the Machine Learning & AI sessions from WWDC25, plus check out my recommended viewing order for the App Intents sessions.


P.S. Here’s the full list of sessions, no sections – copy these into your notes:

List of Apple Developer sessions on Machine Learning & AI from WWDC2025

Categories
Developer

Watch the WWDC2025 App Intents Developer Sessions In This Order

After announcing updates at WWDC, Apple released four new developer sessions directly related to App Intents—the API that lets Apple Intelligence understand and interact with apps—following up on sessions from years past.

Here are this year’s sessions – in my recommended viewing order:

  1. Get to Know App Intents (24:36)
  2. Explore new advances in App Intents (26:49)
  3. Develop for Shortcuts and Spotlight with App Intents (18:56)
  4. Design Interactive Snippets (7:28)

Start with the summary of the API, see what’s new this year, learn the most relevant ways users will interact with your app, and then take a look at advances in snippets – in 1 1/2 hours of focused viewing.

Enjoy – there’s lots to learn!

Check out all the Machine Learning & AI videos from WWDC25 from Apple, plus check out my curated list of the Foundation Models framework sessions.

Categories
Announcements

Announcing My WWDC Meetup: Apple Intelligence Automators at CommunityKit

Hello friends! It’s my pleasure to announce my second-annual WWDC meetup 1, this time as part of the free CommunityKit conference under the name “Apple Intelligence Automators” – sign up here for the free event on Tuesday, June 10 from 2:00 PM – 4:00 PM.

Located inside the Hyatt House San Jose / Cupertino at 10380 Perimeter Rd in Cupertino (just a few minutes from Apple Park), we’ll be discussing the announcements from the WWDC keynote address and State of the Union from the day prior as they relate to Apple Intelligence, App Intents, and Shortcuts.

With Apple Intelligence being the focus of last year’s WWDC, and delays on those features pushing things back, we should have plenty to talk about.

Check out the event page on Luma to register and don’t forget to get your free ticket to CommunityKit.

  1. I hosted a Shortcuts meetup last year – and had a blast.
Categories
Developer

You Should Watch The Apple Intelligence Developer Sessions In This Order

(Editor’s note: updated June 2025 to include sessions from WWDC25)

If you’re getting into development for Apple Intelligence, it can be hard to understand how to parse Apple’s documentation. App Intents, the API that powers the Actions and Personal Context features of Apple Intelligence, has been shipping since 2021, with a deeper history since the introduction of Shortcuts in 2018 – there are over 30 sessions to learn from.

Since I’ve been consulting with developers on their App Intents integrations, I’ve developed a Star Wars Machete Order-style guide for the Apple Intelligence developer sessions – watch the sessions in this order to best understand how Apple thinks about these APIs.

Apple Intelligence Machete Order

How to understand the App Intents framework

Start with the latest sessions from 2024, which reintroduce App Intents as it extends across the system in more ways, and update the Design guidance from their earlier sessions:

Getting Deeper into App Intents

From there, once you have the context of how App Intents can be deployed, start back at the beginning to see how to implement App Intents, then take a look at where they are heading with Snippets:

Importance of App Shortcuts

Built on top of App Intents, App Shortcuts are automatically generated for the most important tasks and content, show up in Spotlight and Siri, and are often the most common way users interact with the App Intents framework:
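
For reference, App Shortcuts are a thin layer defined on top of an intent. Here’s a minimal sketch with hypothetical names:

```swift
import AppIntents

// Minimal sketch (hypothetical names): an App Shortcut wraps an App Intent so it
// appears in Spotlight and Siri automatically, with no setup required from the user.
struct OpenWatchlistIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Watchlist"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // Navigate to the watchlist screen in your app here.
        return .result()
    }
}

struct ExampleAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenWatchlistIntent(),
            phrases: ["Open my watchlist in \(.applicationName)"],
            shortTitle: "Open Watchlist",
            systemImageName: "play.tv"
        )
    }
}
```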

Apple Intelligence sessions

Finally, once you understand the core of App Intents, what it used to be vs. what Apple wants you to do now, and how to deploy App Intents across Spotlight and Siri, move onto the latest updates for Apple Intelligence – new features that enable Personal Context, as well as integrating your intents into domains for Siri:

Good to know

Beyond that, it can be helpful to review earlier sessions to understand where Apple is coming from, as well as learning about the lesser-known experiences your app is capable of providing:

All the Apple Intelligence developer sessions

For good measure, here’s the full list of the Shortcuts / App Intents / Apple Intelligence developer sessions:

Check out more Machine Learning and AI videos from the Apple Developer site, read the full App Intents documentation, and learn more about Apple Intelligence.

P.S. You can hire me to design your App Intents integration.

 

Categories
Announcements

6 Years Later: ‘Worth A Look. Something For Everyone.’

Earlier this evening, I was working on my soon-to-be-relaunched newsletter What’s New in Apple Intelligence, and I opened Reeder to find an article from Daring Fireball titled “15 Years Later: ‘Very Insightful and Not Negative’.”

In the post, Gruber recalled Steve Jobs’ comment about his blog, which he inexplicably hadn’t linked to (until now) – upon reading that, I realized I had my own Tim Cook moment in a similar realm.

Six years ago, on February 28, 2019, I published a story for iMore where I collected over 100 apps that work with Siri, tweeting:

If you’ve been wanting to get into Siri Shortcuts but don’t know where to start…

Here are 100 apps with shortcuts you can add to Siri and use with your own custom phrase: https://x.com/iMore/status/1101285345390444545

The next day, at about 3pm, Tim Cook quoted my post, saying:

“Worth a look. Something for everyone.”

In the moment, I sent a simple “Thanks Tim!” back, but never posted about it on my blog. So, here we are – while you’re here, see what else I have that’s worth a look – there’s something for everyone.

Check out Tim Cook’s tweet and view the story on Daring Fireball.

Categories
News

Apple Releases “Hold That Thought” Shortcut for Global Accessibility Awareness Day

Today is Global Accessibility Awareness Day (GAAD), which Apple highlighted in their press release showcasing accessibility features coming in the next year – plus a new Accessibility shortcut called Hold That Thought:

New features include Accessibility Nutrition Labels on the App Store, Magnifier for Mac, Braille Access, and Accessibility Reader; plus innovative updates to Live Listen, visionOS, Personal Voice, and more.

Near the end of the release, Apple explains their new shortcut, plus the addition of the previous Accessibility Shortcut to Vision Pro:

The Shortcuts app adds Hold That Thought, a shortcut that prompts users to capture and recall information in a note so interruptions don’t derail their flow. The Accessibility Assistant shortcut has been added to Shortcuts on Apple Vision Pro to help recommend accessibility features based on user preferences.

Here’s how Apple describes the shortcut:

Interruptions can cause you to forget tasks and affect productivity, especially for neurodivergent individuals.

When you run this shortcut, you have two options: Capture and Recall.

Run the shortcut and select Capture to capture a screenshot of what you’re doing, any calendar events in the next hour, current open webpage in Safari (Mac only), and Clipboard contents. You’ll then be prompted to write short notes about what you are doing and what you are about to do. Run the shortcut again and select Recall to find the last created note with all the captured information. All notes will be saved with the title “Hold that thought” and the date and time saved.

Run this shortcut using Siri, or add it to the Control Center, Action button or to the Home Screen for quick access.

I love this idea, and the core concept matches the inspiration for my currently-secret app idea that I teased at the end of my Deep Dish Swift talk.

I do have a few suggestions for improvements to the shortcut, however:

  • Remove the errant space in the Choose From Menu prompt between “Capture” and “or” – it says “Capture or recall last stopping point?”
  • For both the emoji-prefixed “Capture” and “Recall” options in Choose From Menu, Apple should add Synonyms for “Capture” and “Recall” – the emoji can cause issues when dictating to Siri (in general, I avoid emoji in Menus for this reason).
  • Utilize the “Find Tabs” action for iOS instead of simply not adding any functionality for Safari on mobile; Apple’s use of only “Get Current Safari Tab” for Mac reminds me that they still have not brought the set of Safari iOS actions added back in 2022 to macOS, and their absence in this shortcut furthers my belief that these highly-sought actions are deprioritized simply because the team doesn’t use iOS as often and this Mac action is “good enough”.
  • The second “Recall” option just opens the note, but I’d rather see the last item I saved – Apple should have gone further to isolate the recent item and display the recalled information, not just open it again. I tried to Recall from my Apple Watch and the shortcut simply failed.
  • The flow of an alert, a 5-second countdown before a screenshot, and two prompts might be too long for most neurodivergent people to capture information effectively while in the process of being interrupted.

To improve the shortcut as it is today, I’d simply remove the Show Alert and Wait actions, and assign this new shortcut to the Action button – that way you can immediately take a screenshot, then answer the prompts, and move on.

Going further, I’d love to see a new version of this next year once Apple Intelligence ships in full, which utilizes “Get On-Screen Content” and accesses all the data available from apps for Personal Context.

Get “Hold That Thought” for Shortcuts, view the announcement from the Apple Newsroom, and check out past updates from GAAD.

Categories
Siri Shortcuts

How Apple Will Win the AI Race: My Talk on App Intents & Apple Intelligence

Last Tuesday, I gave a talk to over 300 developers at Deep Dish Swift about Apple Intelligence, where I made the following claim:

Apple will win the AI race

I’m an expert on App Intents, the API that powers the yet-to-be-seen features of Apple Intelligence – Actions and Personal Context. After designing implementations with my clients, and seeing the trends around AI-assisted coding, hearing rumors of an iOS 19 redesign, and seeing the acceleration effects of artificial intelligence, I believe Apple is skating to where the puck will be, rather than where it is now.

I’ll leave the thesis for the talk – but if you’re building for any Apple devices, you’ll want to understand how important App Intents is to the future of the platform:

Watch the 54-minute talk from Deep Dish Swift on YouTube Live.

Categories
Announcements

No Ticket to WWDC? Come to CommunityKit, the New, Free Alt Conf

If you’re interested in “going” to WWDC, but don’t have a developer ticket – you should sign up for CommunityKit, the alternative conference1 for Apple developers, media, and fans.

From June 9 through 11, join us at the Hyatt House Cupertino to gather with your fellow participants, learn so many new things, and build some great memories.

Each day, we’ll be joined by our wonderful communities, such as Shortcuts and iOSDevHappyHour, to name a few. We’ll also be hosting a live recording of Swift over Coffee, focusing on everything new at WWDC.

Yes, you read that right – I’ll be hosting a Shortcuts/Apple Intelligence/App Intents meetup during one of the afternoons that week! Schedules will be announced later, and I’ll update this post plus create another when I know my official time slot.

Located just a few minutes away from Main Street Cupertino and the Visitor Center at Apple Park, this free conference is designed specifically to make it easy to know where to go if you’re in town for WWDC, merging past events like the watch party from the iOS Dev Happy Hour into one event.

You can watch the WWDC Keynote and State of the Union with developer friends on Monday, plus attend live podcast recordings, join community meetups like mine, and access a hackathon space to work on new ideas all day Tuesday & Wednesday.

To be clear: this means most social events are moving from San Jose to Cupertino this year, so folks don’t have to make their way back-and-forth across those 8 miles as much. This also means anyone coming from out of town or from San Francisco can stay/park at the Hyatt House each day and easily access most WWDC social events.

If you’re unsure whether it’s worth coming to WWDC, let this post convince you – it’ll be a blast and you’ll have something fun to do on Monday, Tuesday, and Wednesday that week.

WWDC is back!2 Get your free ticket to CommunityKit now.


  1. Not to be confused with the now-defunct AltConf
  2. Yes, the official conference has been back for years. But I kept hearing people at Deep Dish Swift ask if the social WWDC is “back”.

    Yes, it is! The social scene has been growing for a few years, but it took a while to come together.

    Now, more of us are coordinating together to make it like the old days where, if you didn’t have a ticket, you could go to AltConf. Now, you can go to CommunityKit! 

 

 

Categories
Siri Shortcuts

Tune In To My Apple Intelligence Talk via Deep Dish Swift Live

I’m super excited to be giving my talk on Apple Intelligence live tomorrow at Deep Dish Swift – if you’re interested in tuning in to the conference stream, follow Deep Dish Swift on YouTube:

Check out Deep Dish Swift live and learn more about the conference.

Categories
How To

How To Rotate Upside-Down Top-Down Camera Footage using the Final Cut Pro Browser

In the process of switching my mounted overhead video setup from a backdrop bar to the Elgato Multi-Mount, I had to make one significant shift – filming upside-down, since the camera is now attached to the back of the desk instead of mounted above from the front. Unfortunately, that means all of my footage needs to be rotated before being usable in editing programs.

In Final Cut Pro for Mac, you can easily rotate clips once you’ve added them to the timeline. However, I’m not actively building a story yet, and I’m instead using the Browser to organize my footage into individual clips using in/out points and Favorites. For a long recording like an unboxing, I can turn an hour of footage into a full set of individual moments as clips, all timed exactly for the action, renamed as needed, and split apart as separate entities in the Browser.

Because of this process and my upside-down footage, by default all my Browser clips are also upside-down, and at first glance this seemed like a big problem for my editing style – timeline editing is very different from clipping in the Browser, and I thought I might be out of luck.

However, thanks to “2old2care” on Reddit (great username), the solution lies in the “Open Clip” menu option, which I’ve never used before:

Yes, you can invert the clip in the browser. Select the clip, then under “Clip” menu select “Open Clip”. You can then go to transform and rotate the clip 180º. I don’t know of a way to create a batch in FCP to do this, although it can be done for the original clips using Compressor.

To save myself the trouble of remembering later, I took screenshots of the process – here’s my setup in Organize mode (under Window > Workspaces > Organize):

How to rotate clips within the Browser using Final Cut Pro

  1. Select the clip you want to rotate – use the filmstrip to identify which files were filmed upside-down.
  2. In the Menu Bar, navigate to Clip > Open Clip, which has no keyboard shortcut. Optionally, assign a keyboard shortcut under Final Cut Pro > Command Sets > Customize (or use ⌥ + ⌘ + K / Option + Command + K to customize immediately).
  3. In the Final Cut Pro window, the selected clip will open in its own timeline view. In the Inspector, select Transform and change the Rotation from 0° to 180°.
  4. In the center of the window, find the clip name and click the dropdown arrow next to it to reveal a context menu – close the clip to return to the full Browser view. The filmstrip will show the flipped clip as you scroll; however, it will continue to show the original upside-down version in the static filmstrip until you leave the project and navigate back or refresh the window.
  5. Repeat for each upside-down clip.

As 2old2care mentioned, batch-processing files like this would be a more ideal solution – I’ll update this post if I find one.

Check out the source on Reddit, get the Multi-Mount from Elgato, and get Final Cut Pro for Mac from Apple.

Categories
Shortcuts

New in the Shortcuts Library: Perplexity shortcuts

I’ve just added a new folder to the Shortcuts Library — my set of Perplexity shortcuts for asking Perplexity to do research for you.

Use these to open the sections of the website, ask questions in new threads on iPhone and iPad, interact with the Mac app using keyboard shortcuts, go deeper on the Perplexity experience, and interact with the API:

Website

  • Open Perplexity AI: Opens the website for Perplexity AI in your default browser.
  • Open Perplexity Discover: Opens the Discover page from Perplexity, which curates top stories for you and summarizes them.
  • Open my Spaces in Perplexity: Opens the Spaces section of Perplexity, where you can create research and collaboration hubs built on top of Perplexity search.
  • Open Perplexity Library: Opens the Library section of Perplexity, where you can see Threads and Pages around searches you’ve performed.

iOS and iPadOS app

  • New Search in Perplexity: Opens Perplexity to a new, blank search using the Auto mode.
  • New Pro Search in Perplexity: Opens Perplexity to a new, blank search set to Pro mode, which acts as your conversational search guide. “Instead of quick, generic results, Pro Search engages with you, fine-tuning its answers based on your needs.”
  • Ask Perplexity: Prompts you to “Ask anything” before opening into Perplexity to search for your query.
  • Play Perplexity Discover: Immediately starts a Live Activity session for Perplexity Discover, using stories drawn from the Discover feed and spoken by ElevenLabs voices.
  • Summarize articles with Perplexity: Creates a series of Threads in Perplexity for URLs shared as input, either from the Share Sheet or by detecting what’s on screen. Includes logic for multiple links, opening each URL in the background until the final query.

Mac app

  • Set up Perplexity for Mac: Opens the Mac app for Perplexity AI, resizing the window to 1024×770 and moving it to the center of the current display.
  • New Thread in Perplexity: Simulates the keyboard shortcut for Command + Shift + P, which activates the Perplexity search bar from anywhere (see the sketch after this list for a code illustration of keystroke simulation).
  • Voice Mode in Perplexity: Simulates the keyboard shortcut for Command + Shift + M, which activates the Perplexity voice mode in a popover window.
  • Upload File in Perplexity: Simulates the keyboard shortcut for Command + Shift + U, which activates the Perplexity upload process and shows a Finder window where you can select the file to upload.
  • Voice Dictation in Perplexity: Simulates the keyboard shortcut for Command + Shift + D, which activates the Perplexity voice dictation in the search bar so you can enter a query hands-free.
  • Screen Capture with Perplexity: Simulates the keyboard shortcut for Command + Shift + 0, which activates the Perplexity screen capture and prompts whether to capture an Area, Window, or Fullscreen.
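
These Mac shortcuts work by simulating keystrokes through Shortcuts’ built-in scripting actions. As a rough illustration of the same idea in code (not how the shortcuts themselves are built), here’s a minimal Swift sketch using CGEvent, which requires Accessibility permission:

```swift
import CoreGraphics

// Rough illustration of simulating Command + Shift + P.
// 0x23 is the virtual key code for "P" on ANSI keyboards.
let source = CGEventSource(stateID: .hidSystemState)
let keyDown = CGEvent(keyboardEventSource: source, virtualKey: 0x23, keyDown: true)
let keyUp = CGEvent(keyboardEventSource: source, virtualKey: 0x23, keyDown: false)
keyDown?.flags = [.maskCommand, .maskShift]   // hold ⌘ and ⇧ for the keystroke
keyUp?.flags = [.maskCommand, .maskShift]
keyDown?.post(tap: .cghidEventTap)
keyUp?.post(tap: .cghidEventTap)
```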

Deep Dive

  • Open my Perplexity account settings: Opens the Perplexity website to Settings > Account, where you can change general settings like the Appearance, as well as subscription details or system settings.
  • Edit my Perplexity profile: Opens the Perplexity settings to the Profile section, where you can tell Perplexity information about yourself and your preferences to inform results.
  • Read the Perplexity blog: Opens the Perplexity blog, where you can see stories and announcements from the team on new updates or changes to the service.
  • Open the Perplexity discord: Opens the deep link into Discord for the Perplexity channel using the unique ID and channel ID.
  • Show the Perplexity Supply store: Opens the website for Perplexity Supply, the clothing line for fans of Perplexity.

API

  • Open the Perplexity API docs: Opens the URL for the Perplexity API documentation, so you can quickly reference how to get started or learn about the API.
  • Open the Perplexity API reference: Opens the Perplexity API website to the API reference, starting with Chat Completions, where you can test your commands against the API and see what’s working.
  • Get my Perplexity API key: Stores your API key for Perplexity as base64-encoded text so it’s not readable as plain text, then decodes it each time the shortcut runs (see the sketch after this list).
  • Manage my Perplexity API keys: Opens the Perplexity website to your API settings, where you can manage API keys and payment details.
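
The base64 step in the API key shortcut is simple obfuscation, not encryption. Here’s a rough Swift sketch of the same idea, using a placeholder key:

```swift
import Foundation

// What you'd save inside the shortcut: the key stored as base64 text.
let storedKey = Data("pplx-your-api-key".utf8).base64EncodedString()

// What happens when the shortcut runs: decode the key back to plain text.
if let data = Data(base64Encoded: storedKey),
   let apiKey = String(data: data, encoding: .utf8) {
    print(apiKey)   // ready to use in an API request
}
```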

Check out the folder of Perplexity shortcuts on the Shortcuts Library.

 

Categories
News

New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4

The first look at Personal Context for Apple Intelligence is here, as APIs available in the iOS 18.4 developer betas allow apps to expose their content for the system to understand. This sets the stage for the most significant update to Siri so far, where all your apps can provide Siri with the available views and content to work with – in a secure and private manner, too.

As first mentioned by Prathamesh Kowarkar on Mastodon, there is now a suite of APIs in beta that associate an app’s unique content, called an entity, with a specific view – this allows Siri to read what’s indexed on-screen and use it with other apps’ actions when triggered by a command.

APIs like this are necessary for the coming Siri update to actually do what Apple says Apple Intelligence is capable of – now that the functionality is here, however, it’s up to developers to implement everything to make sure the experience works well.
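
For context, an “entity” is an app’s unit of content described through the App Intents framework. Here’s a minimal sketch with hypothetical names (the new view-association APIs from the 18.4 betas aren’t shown):

```swift
import AppIntents

// Minimal sketch of an app "entity" (hypothetical names).
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

struct NoteQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [NoteEntity] {
        // Look up your app's real notes by ID here.
        []
    }
}
```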

Here are the new pages:

If these APIs are in beta now, it stands to reason they’ll leave beta after iOS 18.4 releases in full – which means Personal Context might be coming as early as iOS 18.4.

Check out the post from Kowarkar on Mastodon.

 

Categories
Links

Apple unveils new Mac Studio, the most powerful Mac ever »

From the Apple Newsroom:

Apple today announced the new Mac Studio, the most powerful Mac ever made, featuring M4 Max and the new M3 Ultra chip. The ultimate pro desktop delivers groundbreaking pro performance, extensive connectivity now with Thunderbolt 5, and new capabilities in its compact and quiet design that can live right on a desk. Mac Studio can tackle the most intense workloads with its powerful CPU, Apple’s advanced graphics architecture, higher unified memory capacity, ultrafast SSD storage, and a faster and more efficient Neural Engine.

My M1 Mac mini from 2020 is also way overdue for an upgrade…

View the original.

Categories
Links

Apple introduces the new MacBook Air with the M4 chip and a sky blue color »

From the Apple Newsroom:

Apple today announced the new MacBook Air, featuring the blazing-fast performance of the M4 chip, up to 18 hours of battery life, a new 12MP Center Stage camera, and a lower starting price. It also offers support for up to two external displays in addition to the built-in display, 16GB of starting unified memory, and the incredible capabilities of macOS Sequoia with Apple Intelligence — all packed into its strikingly thin and light design that’s built to last.

I’ve been rocking the M1 MacBook Air from 2020, but it’s beyond time I upgraded…

View the original.

Categories
Shortcuts

Immediately Browse Apple News’ Food Recipe Catalog With These Shortcuts

New in iOS 18.4, Apple is making a new Food section available to Apple News+ subscribers, creating a curated browsing and recipe experience within the app. Located on iPhone under the Following tab and Food section, or in the Food section of the sidebar on iPad and macOS, this new category curates stories for you based on your chosen interests and browsing history, plus provides an entire Recipe Catalog and cooking experience for recipes with ingredients & instructions.

The entire experience for News+ Food is fantastic, albeit somewhat buried inside the News app – that’s why I’ve built a set of shortcuts to quickly access the sections from anywhere. In my folder of Apple News Food shortcuts, you can find shortcuts to access the main Food section, the Recipe Catalog, and two curated sections that are shown within the category for Healthy Eating and Kitchen Tools & Techniques.

You can use these with Siri, place them in a Medium widget, or even add them as Controls in Control Center or the Lock Screen – the Recipe Catalog would work great using Add to Home Screen as well, as Stephen Robles demonstrated in his video that highlights the Food feature.

So far, the News team at Apple has only ever created the Show Today Feed and Show Topic actions, and relied on the concept of “donations” (where an action only becomes available after the user interacts with a particular section) for sections like Magazines, Puzzles, and now the Recipe Catalog. Along this route, I’d love to see the Saved Recipes section available as a donated action, and being able to open directly to a saved recipe would make a lot of sense too. But, going further, I wish the News team would adopt a full suite of actions like Get Recipes, Find Recipe, Save/Unsave Recipe, Cook Recipe, and Read The Story (for a recipe).

Get the folder of Apple News Food shortcuts in my Shortcuts Library (requires iOS 18.4).