Categories
Gear

Move Over iPhone Pocket, I Want AirPods Beanies Back

With the news that Apple has released an eyebrow-raising $149 knitted sleeve called the iPhone Pocket (at a whopping $249 in Long size)1, I am reminded of a much more affordable and practical product from Native Union – the now-discontinued AirPods Beanies.

Modeled after the iPod Sock, the AirPods Beanies are a set of four cloth sleeves for your AirPods, one in each color – my partner and I have been sharing them for the last four years. Designed to slide over the AirPods case, the sleeve protects it from scratches or dings in your pocket, and adds a bit of grip so it doesn't slide out when you lean back in a chair.

I love my AirPods Beanies and hope to see Native Union bring them back, especially at the $19.99 price. Until then, you can see their retrospective from the launch, which coincided with the 20th anniversary of the iPod.

On the actual iPhone Pocket itself – I am not immediately in love with the price, the aesthetics, or the reality of what it's like to use this product. But that's not really the point – it's a collaboration with a designer, and an exploration of a product.

Last year, Apple featured the Tech21 FlexPro Case, which includes a wrist strap or a cross-body strap – I used it for my trip to Spain for my App Intents conference talk.

This year, they’ve now released the Crossbody Strap for iPhone:

The Crossbody Strap is designed to attach to select Apple cases for a convenient and hands-free way to wear iPhone.

Beautifully crafted from 100 percent recycled PET yarns, the smooth, narrow woven straps drape comfortably across the body.

Embedded flexible magnets with stainless steel sliding mechanisms allow you to effortlessly adjust the length for the perfect fit, while keeping both straps securely and neatly aligned.

Perhaps we’ll see something similar from Apple themselves in response to the iPhone Pocket? AirPods Socks?

Read the blog post from Native Union, plus check out their current line of AirPods cases.

Get the iPhone Pocket or the Crossbody Strap from Apple, and read the press release from Apple Newsroom.

  1. For an actual explanation, here’s the product description from Apple:

    “iPhone Pocket is a collaboration between Apple and ISSEY MIYAKE. Based upon a mutual respect and shared approach to design, it’s inspired by the concept of “a piece of cloth” and features a singular 3D-knitted construction designed to fully enclose iPhone, while expanding to fit your everyday items.

    Featuring a ribbed mesh structure with the qualities of the original pleats patented by ISSEY MIYAKE, iPhone Pocket is a beautiful way to wear and carry iPhone. When stretched, the open textile subtly reveals its contents and allows you to peek at your iPhone display. Born out of the idea of creating an additional pocket, while also being playful and versatile, iPhone Pocket is available in a short strap length (in eight colors), and a long strap length (in three colors), suitable for a variety of wearing styles – handholding, tying onto bags, or wearing directly on your body.”
Categories
Announcements News

Shortcuts Creator Stephen Robles Goes Solo; Sign Up For His Membership

If you've seen any linked posts on my blog, you've probably seen Stephen Robles – a prolific content creator who's made a massive impact in the Shortcuts community over the last few years. Just last week, Stephen announced that, thanks to the support of his community, he's gone solo and left his job to pursue YouTube & his Shortcuts membership full-time.

My congratulations to Stephen – he's an amazing person, a hard worker, and has given me so much value with his Shortcuts work.

Here’s what his membership program offers:

Become a member of the community and gain access to my ShortcutsGPT, Shortcut of the Week, searchable Shortcuts database, priority Shortcuts requests, and more!

Watch his announcement video, follow Stephen on YouTube, and sign up for his membership program to support his independence.

Categories
Developer Links

Ideally, Apple Intelligence Could Query Your Personal Context »

Jason Snell on the Upgrade podcast:

It’s the idea that there’s a personal data trove that you have. And then you’re asking the model to query the contents. […] You know about all this stuff, now do stuff with it.

And if you can do that on device, it’s really powerful. [I]t’s hard to do that in the cloud because you would actually need to upload it to Private Cloud Compute. And that’s a lot of data. So what you want is some parsing to happen on device.

But that’s the dream, right? Is that your phone knows about your stuff and then the LLM that’s running can make uh distinctions based on your stuff.

And ideally the model could potentially, and I know this is wild, I don’t think they’ve talked about it, but ideally the model could query your personal context, get a bunch of related data out, and then send that in a query to the private cloud and have it process it, right? You could have a kind of cherry picking your personal data model on device that then kicks it to the more powerful model to process it and intuit things about it.

There’s lots of ways you could do this. It’s a great idea. It was a great idea in 2024 when they showed it, but they got to do it – is the challenge there.

In reply to Jason, cohost Myke Hurley said the following:

So, I’m just going to make a little prediction. These[…] things that I’ve spoken about, we will see iOS 27 before [they] ship.

I believe they will have stuff – like, I believe they will have [releases] in the spring like it has been rumored, but I don’t think all of these things.

I think particularly the Personal Context thing… we may never see that.

For what it's worth, Apple has done this and named it Queries. Shortcuts users might better understand this as the Find actions, which find and filter data from apps before using it in their shortcuts.

Introduced for developers alongside the App Intents API in 2022, Queries are how intents/actions retrieve entities/data from apps. In the most recent session, "Get to know App Intents" from 2025, Apple explicitly says the following – a phrase that caught my attention in regard to the "new Siri" we've been waiting for:

Queries are the way the system can reason about my entities

Apple has also been building out their ability to index and query these entities through their Spotlight support, as well as now Visual Intelligence.

You can learn more about Entity Queries & Indexed Entities, and watch the developer sessions for Get to Know App Intents & Explore new advances in App Intents.

Check out Upgrade #588, follow the show on Apple Podcasts, or watch the video on YouTube.

Categories
Gear Links

Marvel 3D Movies on Apple Vision Pro “Look Better Than Anyone Has Ever Seen Them Before” »

With the news today that Marvel just updated The Fantastic Four: First Steps for 3D on Apple Vision Pro, I was reminded of an old thread from Marvel VFX supervisor Evan Jacobs where he made the following claim:

The 3D Marvel films on the AVP look better than anyone has ever seen them before. The capabilities of the VisionPro are really unique and we remastered all the films for this format.

And:

Our goal was to match the color and brightness of the 2D HDR versions but for 3D. The Vision Pro delivers basically perfect stereo contrast so no ghosting, HDR color, UHD resolution and we did some work on the older titles as well.

Two Reddit threads reference the post, but it appears Jacobs left Twitter and his X account no longer exists – however, I found a direct quote from this Apple Vision Pro forum.

In Disney’s press release at the time, they also said the following:

With 3D movies, Disney’s storytelling will also leap off the screen like never before with remarkable depth and clarity for an unprecedented in-home 3D experience on Disney+ with Apple Vision Pro.

Check out the forum post, view the original press release from Disney, and see how F4 looks on Vision Pro from Ben Geskin on X:

Categories
Developer Links

How to integrate your app with Visual Intelligence »

From the Apple Developer documentation (break added):

With visual intelligence, people can visually search for information and content that matches their surroundings, or an onscreen object.

Integrating your app with visual intelligence allows people to view your matching content quickly and launch your app for more detailed information or additional search results, giving it additional visibility.

And:

To integrate your app with visual intelligence, the Visual Intelligence framework provides information about objects it detects in the visual intelligence camera or a screenshot. To exchange information with your app, the system uses the App Intents framework and its concepts of app intents and app entities.

When a person performs visual search on the visual intelligence camera or a screenshot, the system forwards the information captured to an App Intents query you implement. In your query code, search your app’s content for matching items, and return them to visual intelligence as app entities. Visual intelligence then uses the app entities to display your content in the search results view, right where a person needs it.

To learn more about a displayed item, someone can tap it to open the item in your app and view information and functionality. For example, an app that allows people to view information about landmarks might show detailed information like hours, a map, or community reviews for the item a person taps in visual search.

Browse the full documentation from the Apple Developer site and learn how to use Visual Intelligence for iPhone.

 

Categories
Gear How To Links

How to use Visual Intelligence on iPhone »

From Apple Support:

Use visual intelligence to quickly learn more about what’s in front of you, whether in your physical surroundings or on your iPhone screen.

To learn more about your physical surroundings using your iPhone camera on models that have the Camera Control, just click and hold it to do things like look up details about a restaurant or business; have text translated, summarized, or read aloud; identify plants and animals; search visually for objects around you; ask questions; and more. […You can also] access visual intelligence by customizing the Action button or Lock Screen, or opening Control Center. See Alternate options to using the Camera Control.

To learn more about the content on your iPhone screen across your apps, simply press the same buttons you use to take a screenshot. You can search visually, ask questions, and take action, like turning a flyer or invite into a calendar event.

I've been learning more about this now that developers can integrate their apps with Visual Intelligence.

View the full piece on the Apple Support site and read more about the Developer documentation.

 

Categories
Links

Mark Gurman on TBPN: How Siri Will Be Powered By Google’s Gemini »

In an effort to put my TBPN shortcuts to good use, I turned on today’s stream for Monday, November 3rd – and happened upon a segment with Mark Gurman, Managing Editor and Chief Correspondent at Bloomberg News:

They discussed how Gurman got started covering Apple, the stories resurfacing this week around the Siri update being powered by Google’s Gemini, and the iPhone 17 lineup.

View the clip on YouTube.

 

Categories
Podcasts

Members-Only Podcast #4: First Dispatch from the New Studio

From episode 4 of my members-only podcast:

Shortcuts news since the betas, my iPhone 17 feedback, going viral, Sky getting acquired to work on ChatGPT, and… App Intents.

This content is marked as members-only – you’ll need a membership to access it.

View the archive of members-only podcast episodes.

Categories
Links

Shortcuts Showdown on The Vergecast »

From Stephen Robles:

David Pierce, host of The Vergecast often complains that Shortcuts is too complicated and not useful. Equally as often, I tell him he’s wrong on social media, but this time I got to do it live! My thanks to David for inviting me on The Vergecast, and I’m pretty sure I won this round.

Check the post on Stephen’s site Beard.FM, check out the episode of The Vergecast (and follow the show on Apple Podcasts), and see the full video on YouTube.

Categories
Shortcuts

Tune In to TBPN for Today’s Technology and Business News (With These Shortcuts)

If you haven't heard of TBPN, the "Technology Brothers Podcast Network" is an increasingly popular show by hosts Jordi Hays and John Coogan that covers the major news of the day in the technology and business world – almost like a CNN for Silicon Valley. Streaming live from 11 AM – 2 PM PST every weekday, TBPN is known for high-profile guest interviews, clippable moments shared on social media, and a somewhat-irreverent tone paired with a deep knowledge & passion for the space.

I’m a fan of the way TBPN has given a breath of fresh air to technology coverage, simultaneously innovating on top of cable TV news, video & audio podcasts, and livestream formats in a new media organization for the current era. When things like AI are changing within a single week, the show provides a spotlight for understanding what’s going on as things move so quickly – and they demonstrate a better grasp of how to spread their content than any organization I’ve seen lately. The show has even evolved into a de-facto part of the technology media circuit, where having your startup’s news broken on TBPN is an indicator of success (much like getting coverage on TechCrunch).

With the show's 3-hour runtime and multiple formats, it's reasonable that an average listener won't always engage with the entirety of each show, so it sure would be helpful if there were some sort of way to access everything as needed… like a shortcut, perhaps.

Me being me, I built a folder of shortcuts for TBPN for Apple’s Shortcuts app. These shortcuts let you listen to the show on Apple Podcasts and Spotify, watch the livestream or put the feed up on your TV, plus follow the team on X. Plus, as you’ll soon learn, the show is heavily sponsored by Ramp, so I created a cheeky shortcut for anyone to learn more from their website.

I also wrote a blog post about the technique for opening the TBPN livestream, which involves adding /live to any YouTube channel URL.

Check out the shortcuts in the TBPN folder of my Shortcuts Library, and view the blog post on YouTube livestreams. Plus, follow TBPN directly on YouTube, Apple Podcasts, Spotify, and X.

Categories
How To

How To See Any YouTube Channel’s Livestreams Immediately

If you’re someone who enjoys watching livestreams on YouTube, you may not know about the permanent redirect for every YouTube channel that takes you straight to their current livestream or recent streams – just add /live to the channel URL.

I recently published a set of shortcuts for TBPN, the tech & business news podcast, which I like to watch live on YouTube occasionally – the shortcut "Watch TBPN Live" uses the /live redirect to the show. In this context, having one URL for both a live show and a recent stream is ideal, because I can jump straight to the full shows – oftentimes the Home tab of a YouTube channel is filled with clips or playlists, and this makes it easy to get straight to the latest full streams of any live video podcast.

My YouTube channel URL is https://www.youtube.com/@matthewcassinelli, and the URL for the Live page is technically https://www.youtube.com/@matthewcassinelli/streams. However, adding /live to the channel URL—https://www.youtube.com/@matthewcassinelli/live—creates a redirect that goes to that same streams page when I’m not live – or directly to the current livestream when I’m live on-air.
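
As a sketch of that construction (just a string operation, shown here with my channel URL from above), the redirect URL is built like so:

```python
def live_url(channel_url: str) -> str:
    """Build the /live redirect URL for a YouTube channel URL."""
    # Drop any trailing slash so we never produce a double "//live".
    return channel_url.rstrip("/") + "/live"

print(live_url("https://www.youtube.com/@matthewcassinelli"))
# → https://www.youtube.com/@matthewcassinelli/live
```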

I love using this when directing people towards my own livestreams, because it provides a single, clean permalink that never changes and can be used in any social media post written ahead of the stream. Once the show is over, I can use the actual video permalink to share the episode, but ahead of time this single /live redirect is ideal for promotion.

Plus, since I can't help myself, I built a shortcut for livestreams on YouTube that takes advantage of this exact capability. My "Open livestreams for this channel" shortcut takes any current video URL, scrapes the channel URL from its metadata, and redirects you to the livestreams page of that channel – so you can see their latest streams and even tune in immediately if they're live now. Try calling up the shortcut using Type to Siri next time you're watching a video to check whether the channel does any livestreaming.

Get the shortcut in the YouTube Videos folder in my Shortcuts Library. Plus, follow me on YouTube and check out my livestreams.

 

Categories
Shortcuts

New in the Shortcuts Library: TBPN shortcuts

I’ve just added a new folder to the Shortcuts Library — my set of shortcuts for TBPN, the tech & business news podcast.

Use these shortcuts to watch the video podcast, tune into episodes on the go, and find the show on X – plus check out Ramp, of course:

  • Watch TBPN TV: Takes the RSS feed for TBPN's YouTube channel and opens the most recent item. Includes an option to AirPlay to Apple TV, or to open in full screen on Mac.
  • Watch TBPN Live: Opens the /live URL of TBPN’s YouTube channel, which redirects to either the current livestream or the page of recent streams.
  • Play TBPN on Apple Podcasts: Finds and plays the latest episode of TBPN in the Apple Podcasts app. Also follows the show if you’re not already.
  • Open TBPN on Spotify: Opens the deep link into Spotify for the TBPN podcast.
  • Open TBPN on X: Opens the link into @TBPN’s profile on X, either on the web or into the mobile app.
  • Open Jordi Hays on X: Opens the X profile for cohost Jordi Hays.
  • Open John Coogan on X: Opens the X profile for cohost John Coogan.
  • Save time and money at Ramp dot com: Opens the website for Ramp, the main sponsor of TBPN.

Check out the folder of TBPN shortcuts in the Shortcuts Library.

Categories
Links

Creative Neglect: What About the Apps in Apple? »

Joe Rosensteel, writing for Six Colors:

One of the things that I think about from time to time is Apple’s collection of apps. Some are the crown jewels, like Apple’s pro apps, and others help an everyday consumer to tackle their iLife. All are pretty starved for attention and resources, outside of infrequent updates aligned with showing off the native power of Apple Silicon, Apple Intelligence, or demos of platform integration that never quite get all the way there.

Three things really brought this up to the surface for me recently: The neglect of Clips and iMovie, the radio silence regarding Pixelmator/Photomator, and Final Cut Pro being trotted out for demos but not shipping appropriate updates.

I agree with Joe's sentiment, but direct it more towards—you guessed it—the Shortcuts app than Pixelmator, which I've been saying is still within a reasonable window for updates – anything they're working on could only feasibly ship after an entire yearly cycle.

Shortcuts, on the other hand, has been out for over 5 years and still hasn’t evolved too far beyond its original Workflow UX – Six Colors’ own Jason Snell just talked about how Shortcuts is not really that friendly on Monday of this week.

Read the whole story on Six Colors.

 

Categories
Links

OpenAI acquires Software Applications Incorporated, maker of Sky

From the OpenAI company blog:

AI progress isn’t only about advancing intelligence—it’s about unlocking it through interfaces that understand context, adapt to your intent, and work seamlessly. That’s why we’re excited to share that OpenAI has acquired Software Applications Incorporated, makers of Sky.

And:

“We’ve always wanted computers to be more empowering, customizable, and intuitive. With LLMs, we can finally put the pieces together. That’s why we built Sky, an AI experience that floats over your desktop to help you think and create. We’re thrilled to join OpenAI to bring that vision to hundreds of millions of people.” —Ari Weinstein, Co-Founder and CEO, Software Applications Incorporated

Incredible run by my former teammates – first selling Workflow to Apple, and now Sky to OpenAI.

I’m super excited to see their deep talent and passion reflected in the ChatGPT app.

Read the full blog post from OpenAI.

 

Categories
Offsite

Former Apple Engineer Matthew Cassinelli Reveals iWork Powers Internal Operations

Accidentally wrote the perfect tweet – a technically-true point that can be wildly misinterpreted, which went viral on Twitter (and which, of course, Grok got slightly wrong):

Former Apple engineer1 Matthew Cassinelli disclosed that the company conducts much of its internal operations using its own iWork apps, including Calendar, Contacts, Pages, Numbers, and Keynote, as a key aspect of its dogfooding practice. This revelation, shared on X and garnering over 15,000 engagements, highlights Apple’s commitment to testing products internally to drive improvements, with proprietary backend tools enhancing functionality for its needs. While users praised the apps’ integration and usability in areas like Keynote and Pages, criticisms focused on Contacts’ cumbersome interface and the occasional reliance on tools like Excel for complex tasks.

A few clarifications (which you can find in more detail in the replies):

  • I both used and enjoyed these apps before I joined, but I had never seen a whole company committed to them.
  • Apple does not necessarily “conduct much of its internal operations” within iWork – they all have the apps and use them, but there are many other tools in place.
  • Employees are given these apps and use them by default, but do not exclusively use these tools; the people in Finance use Excel, for example, and anything specialized like CAD is also used – in addition to iWork.
  • Contacts and Calendar specifically are buoyed by an internal directory; Mail also has server-side rules that make filtering easy for employees but aren't in the shipping product.
  • I worked at Apple in 2017, so this is outdated – still true at a base level, but they’ve adopted more advanced tools like Slack since then.
  • On a basic level, Apple provides the apps because they make the apps, and it wouldn’t make sense to pay for a second set of tools for every employee while also not using your own freely-available product.
  • I started my YouTube channel because I got Final Cut Pro for free.

Plus, while you’re here – if you’re ever running into speed problems with apps like Contacts or Calendar, you should look into Shortcuts. For example, the new Use Model action for Apple Intelligence makes tasks like processing contact information much easier to build within a few steps.

View the story from Trending on X2.

  1. For what it’s worth, I was a Product Specialist and not an engineer – I studied Business Administration and Marketing before joining Workflow. I became a programmer because of Workflow (now Shortcuts), but I don’t want to misrepresent myself as a former Apple engineer.
  2. Feel free to repost your own wildly-misinterpreted version of my point so I can hit Creator Monetization and get paid.
Categories
Feedback Gear Siri Shortcuts

Dual Capture and Landscape Selfies for iPhone Need Shortcuts Support

On iPhone 17 models, Apple has added new hardware and software updates for advanced Camera features like Dual Capture and Center Stage, which allow capturing footage in more dynamic ways than ever.

Quickly accessing new features like this and forming muscle memory is critical to user adoption & long-term habits, which is why Apple should expand the Camera app’s Shortcuts support to everything new – something I’ve requested directly via the Feedback app in issue FB20772988 (Add Dual Capture and Selfie Video to Camera actions in Shortcuts).

Dual Capture and Selfie Rotate on iPhone 17

With any iPhone 17, you're now able to capture both front-facing and rear-facing footage at the same time in a Dual Capture experience. This is an awesome merging of hardware and software that creates a personal capturing experience I've loved since the Frontback days – a memory that says "here's where I am", but also "here's who I am" (and "here's who I'm with" too).

Plus, the selfie sensor has been expanded to a square size to allow both portrait and landscape capture, enabling features like a Selfie Rotate button to shoot in landscape while holding the phone vertically,1 as well as Center Stage functionality that automatically expands the shot depending on how many people are paying attention in-frame.

On The Stalman Podcast, Apple iPhone Product Manager Megan Nash specifically mentioned that holding the phone vertically created better eye gaze, which is otherwise awkward and often prevents people like me from filming themselves:

“You’ll notice people in the photos have better eye gaze because the camera preview is centered with the front camera, rather than being off to the side when you rotate iPhone to horizontal.”

These are incredible additions to the lineup and the primary reason I was excited to upgrade this year, both of which will make everyday content creation easier and also more dynamic.

Expand Camera’s App Shortcuts Support

I’m proposing that Apple add these features into the Camera app’s Shortcuts support, either in the form of expanded App Shortcuts or an overhaul to the Camera actions.

Currently, in Shortcuts, the Camera app has a single action, Open Camera, that opens the camera in a specified mode. As of writing, you’re able to choose from Selfie, Video, Portrait, Portrait Selfie, Photo, Cinematic, Slo-Mo, Time-Lapse, Pano, Spatial Video, and Spatial Photo.

Crude rendering by yours truly.

The simplest update would be to include options for Dual Capture and Landscape Selfies, allowing a quick addition to existing functionality. This would build upon the curated App Shortcuts experience, and make these new features immediately available via Siri, on the Lock Screen, in Control Center, and on the Action button nicely – the simplest and most likely outcome.

Overhaul Camera’s App Intents Support

However, I propose Apple give the Camera app a deeper App Intents review and consider splitting up the Open Camera action in alignment with the Camera app redesign, building out the longstanding Take Video and Take Photo actions from Workflow and including additional functionality as parameters.

Take Video could include modes (and App Shortcuts) for Video, Cinematic, Slo-Mo, and Time-Lapse, each with dependent parameters for front-/rear-facing cameras, zoom levels and rotate options, extra features, and video formats. Take Photo could include modes (and App Shortcuts) for Photo, Selfie, Portrait, Spatial, and Pano, with the same additional functionality as parameters for each mode.2

Adding both options as separate actions would add long-desired functionality to the Camera app's existing actions and enable a wide array of creator-focused shortcuts based on hyper-specific shooting modes. Plus, these actions could still be turned into App Shortcuts, enabling everyday users to quickly access Dual Capture or landscape-in-portrait selfies on their new iPhone 17 as needed.

Apple – please make it easier to take landscape selfies!3


If you want to see this update, please duplicate my report4 in the Feedback app to signal to Apple that multiple users want this changed.

  1. FYI according to the Alt Text on the Apple Support website, it is officially called “the Selfie Rotate button.”
  2. There may need to be some slight fudging of "modes" to make a pleasant App Shortcuts experience here – otherwise, having "normal," "Selfie," and "Landscape Selfie" versions of each as additional options might be too much, and I can see why they might've chosen to avoid this route originally. That being said, they should go further with more actions rather than pulling back. ↩︎
  3. There’s got to be a better way to say “enabling landscape selfies while holding iPhone vertically” (from 3:35) – I propose “landscape selfie” as the generic term. ↩︎
  4. On iPhone 17, new Camera modes like Dual Capture and Selfie Rotate let users record from both cameras or film landscape selfies while holding iPhone vertically. These features aren't available in Shortcuts or App Shortcuts, making them harder to access quickly. The simplest improvement would be adding Dual Capture and Selfie Video options to the existing Open Camera action. Longer term, Camera could gain full App Intents support by splitting Open Camera into Take Photo and Take Video actions with parameters for mode, camera, and format. Results Expected: I am expecting to find all Camera functionality, including Dual Capture, Selfie Video, and future modes, available in the Shortcuts app or App Shortcuts experiences for use from the Lock Screen, Control Center, or Action button.

 

Categories
Links

Jason Snell: Shortcuts Is Not Really That Friendly

From Jason Snell, on Upgrade: An LLM in the Woods:

“It’s like me saying, oh, you know, Shortcuts does a pretty good job of being a consumer user scripting utility.

It’s like, well, yeah, but also really no.”

Plus, later:

“I mean, that’s the bottom line is it’s a great idea. And like I said about Misty Studio1, all things considered, it does a pretty good job of being kind of a friendly face to building an AI model, but in the end, it’s like Shortcuts in that it’s not really that friendly.”

Fair enough – if it truly was, I’d have been out of a job for a long time.

Check out the Upgrade podcast on Apple Podcasts and YouTube.2

  1. For reference, they explained Misty Studio earlier:
    > “Misty Studio is a demo that Apple did for the M5. Misty Studio runs an open-source model locally”
  2. P.S. I apologize in advance to Jason for the URL slug 🙂

 

Categories
Developer Links News

Apple’s Foundation Models Framework Unlocks New App Experiences Powered by Apple Intelligence »

From Apple Newsroom:

With the release of iOS 26, iPadOS 26, and macOS 26 this month, developers around the world are able to bring even more intelligent experiences right into their apps by tapping into the on-device large language model at the core of Apple Intelligence.

The Foundation Models framework allows developers to create new intelligence features that protect users’ privacy and are available offline, all while using AI inference that is free of cost.

You can now add intelligence to your apps for free on Apple platforms – and while it’s relatively simple today… that’s only for now.

View the full article.

Categories
Developer

New App Intents and Apple Intelligence Consulting Availability

App Intents are how Apple devices understand and interact with your app. They're the foundation of features like Shortcuts, Siri, Spotlight – and now Apple Intelligence. If you want your app to take advantage of the deepest parts of the Apple ecosystem, it starts with App Intents.

Following recent projects with Foodnoms, MindNode, and Tripsy, I now have availability for App Intents consulting this fall and into 2026. In addition to full start-to-finish projects, I’m introducing new flexible options:

  • Audits: a focused review of your existing intents, data models, and opportunities
  • Docs-only: structured documentation you can use with your team to implement directly

If you want Apple Intelligence to understand your app’s core features, or you want to deploy your app across the system to make a cohesive experience, I can help you design and deliver the following:

  • The unreleased Actions and Context portions of Apple Intelligence
  • App Intents, App Entities, and App Enums for your app
  • Automatically-generated instances of important intents as App Shortcuts
  • Spotlight, Siri, and Controls integrations
  • Custom Shortcuts to be distributed to users
  • Documentation on the new offerings
  • Ongoing updates shared in a developer newsletter

Each engagement starts with a free, one-hour call to assess your needs, discuss budgets and rates, and outline next steps – whether you're working on a brand, part of a team, or an indie developer, we can find a solution that works for you.

You can learn more about my services, explore past client work, and watch my conference talks on my Consulting page. If you're ready to move forward, book a call with me directly to get started.

Let’s make your app one of the best citizens of the Apple ecosystem – ready for Apple Intelligence, Shortcuts, and beyond.