Categories
Developer Links

Ideally, Apple Intelligence Could Query Your Personal Context »

Jason Snell on the Upgrade podcast:

It’s the idea that there’s a personal data trove that you have. And then you’re asking the model to query the contents. […] You know about all this stuff, now do stuff with it.

And if you can do that on device, it’s really powerful. [I]t’s hard to do that in the cloud because you would actually need to upload it to Private Cloud Compute. And that’s a lot of data. So what you want is some parsing to happen on device.

But that’s the dream, right? Is that your phone knows about your stuff and then the LLM that’s running can make uh distinctions based on your stuff.

And ideally the model could potentially, and I know this is wild, I don’t think they’ve talked about it, but ideally the model could query your personal context, get a bunch of related data out, and then send that in a query to the private cloud and have it process it, right? You could have a kind of cherry picking your personal data model on device that then kicks it to the more powerful model to process it and intuit things about it.

There’s lots of ways you could do this. It’s a great idea. It was a great idea in 2024 when they showed it, but they got to do it – is the challenge there.

In reply to Jason, cohost Myke Hurley said the following:

So, I’m just going to make a little prediction. These[…] things that I’ve spoken about, we will see iOS 27 before [they] ship.

I believe they will have stuff – like, I believe they will have [releases] in the spring like it has been rumored, but I don’t think all of these things.

I think particularly the Personal Context thing… we may never see that.

For what it’s worth, Apple has done this and named it Queries. Shortcuts users might better understand this as the Find actions, which let them find and filter data from apps before using it in their shortcuts.

Introduced for developers alongside the App Intents API in 2022, Queries are how intents/actions retrieve entities/data from apps. In Apple’s most recent session, “Get to know App Intents” from 2025, they explicitly say the following – a phrase that caught my attention in regard to the “new Siri” we’ve been waiting for:

Queries are the way the system can reason about my entities

Apple has also been building out its ability to index and query these entities through Spotlight support, and now Visual Intelligence as well.
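To make that concrete, here’s a minimal sketch of what a query looks like in code. The entity, query, and data store below (BookEntity, BookQuery, BookStore) are hypothetical names for illustration rather than Apple sample code, but AppEntity and EntityQuery are the actual App Intents building blocks the sessions cover:

```swift
import AppIntents

// Hypothetical entity exposing a "book" from an app's library to the system.
struct BookEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Book")
    static var defaultQuery = BookQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Hypothetical in-memory store standing in for the app's real data layer.
final class BookStore {
    static let shared = BookStore()
    private let allBooks = [BookEntity(id: UUID(), title: "Sample Book")]

    func books(withIDs ids: [UUID]) -> [BookEntity] {
        allBooks.filter { ids.contains($0.id) }
    }

    func recentBooks() -> [BookEntity] {
        allBooks
    }
}

// The query is how the system asks the app for the entities it can reason about.
struct BookQuery: EntityQuery {
    // Resolve entities the system already holds identifiers for.
    func entities(for identifiers: [BookEntity.ID]) async throws -> [BookEntity] {
        BookStore.shared.books(withIDs: identifiers)
    }

    // Offer a set of entities proactively, e.g. for pickers and suggestions.
    func suggestedEntities() async throws -> [BookEntity] {
        BookStore.shared.recentBooks()
    }
}
```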

You can learn more about Entity Queries & Indexed Entities, and watch the developer sessions for Get to Know App Intents & Explore new advances in App Intents.

Check out Upgrade #588, follow the show on Apple Podcasts, or watch the video on YouTube.

Categories
Gear Links

Marvel 3D Movies on Apple Vision Pro “Look Better Than Anyone Has Ever Seen Them Before” »

With the news today that Marvel just updated The Fantastic Four: First Steps for 3D on Apple Vision Pro, I was reminded of an old thread from Marvel VFX supervisor Evan Jacobs where he made the following claim:

The 3D Marvel films on the AVP look better than anyone has ever seen them before. The capabilities of the VisionPro are really unique and we remastered all the films for this format.

And:

Our goal was to match the color and brightness of the 2D HDR versions but for 3D. The Vision Pro delivers basically perfect stereo contrast so no ghosting, HDR color, UHD resolution and we did some work on the older titles as well.

Two Reddit threads reference the post, but it appears Jacobs left Twitter and his X account no longer exists – however, I found a direct quote from this Apple Vision Pro forum.

In Disney’s press release at the time, they also said the following:

With 3D movies, Disney’s storytelling will also leap off the screen like never before with remarkable depth and clarity for an unprecedented in-home 3D experience on Disney+ with Apple Vision Pro.

Check out the forum post, view the original press release from Disney, and see how F4 looks on Vision Pro from Ben Geskin on X.

Categories
Developer Links

How to integrate your app with Visual Intelligence »

From the Apple Developer documentation (break added):

With visual intelligence, people can visually search for information and content that matches their surroundings, or an onscreen object.

Integrating your app with visual intelligence allows people to view your matching content quickly and launch your app for more detailed information or additional search results, giving it additional visibility.

And:

To integrate your app with visual intelligence, the Visual Intelligence framework provides information about objects it detects in the visual intelligence camera or a screenshot. To exchange information with your app, the system uses the App Intents framework and its concepts of app intents and app entities.

When a person performs visual search on the visual intelligence camera or a screenshot, the system forwards the information captured to an App Intents query you implement. In your query code, search your app’s content for matching items, and return them to visual intelligence as app entities. Visual intelligence then uses the app entities to display your content in the search results view, right where a person needs it.

To learn more about a displayed item, someone can tap it to open the item in your app and view information and functionality. For example, an app that allows people to view information about landmarks might show detailed information like hours, a map, or community reviews for the item a person taps in visual search.
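In practice, that query is an App Intents value query that receives a descriptor of whatever visual intelligence captured. Here’s a minimal, hypothetical sketch of the flow the documentation describes – LandmarkEntity, LandmarkQuery, and the stand-in result below are my own illustrative names, and the actual image-matching logic is left out:

```swift
import AppIntents
import VisualIntelligence

// Hypothetical entity representing a landmark the app knows about.
struct LandmarkEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Landmark")
    static var defaultQuery = LandmarkQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

// Standard App Intents query so the system can resolve landmark identifiers.
struct LandmarkQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [LandmarkEntity] {
        // Hypothetical lookup; a real app would fetch these from its data store.
        identifiers.map { LandmarkEntity(id: $0, name: $0) }
    }
}

// The visual intelligence hook: the system hands the app a descriptor of what
// the camera or a screenshot captured, and the app returns matching entities
// that appear in the visual search results view.
struct LandmarkVisualSearchQuery: IntentValueQuery {
    func values(for input: SemanticContentDescriptor) async throws -> [LandmarkEntity] {
        // The descriptor carries the captured image; bail out if it's missing.
        guard input.pixelBuffer != nil else { return [] }

        // Hypothetical stand-in result; a real implementation would match the
        // captured image against the app's own content.
        return [LandmarkEntity(id: "golden-gate", name: "Golden Gate Bridge")]
    }
}
```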

Browse the full documentation from the Apple Developer site and learn how to use Visual Intelligence for iPhone.

 

Categories
Gear How To Links

How to use Visual Intelligence on iPhone »

From Apple Support:

Use visual intelligence to quickly learn more about what’s in front of you, whether in your physical surroundings or on your iPhone screen.

To learn more about your physical surroundings using your iPhone camera on models that have the Camera Control, just click and hold it to do things like look up details about a restaurant or business; have text translated, summarized, or read aloud; identify plants and animals; search visually for objects around you; ask questions; and more. […You can also] access visual intelligence by customizing the Action button or Lock Screen, or opening Control Center. See Alternate options to using the Camera Control.

To learn more about the content on your iPhone screen across your apps, simply press the same buttons you use to take a screenshot. You can search visually, ask questions, and take action, like turning a flyer or invite into a calendar event.

I’ve been learning more about this now that developers can integrate their apps with Visual Intelligence.

View the full piece on the Apple Support site and read more in the Developer documentation.

 

Categories
Links

Mark Gurman on TBPN: How Siri Will Be Powered By Google’s Gemini »

In an effort to put my TBPN shortcuts to good use, I turned on today’s stream for Monday, November 3rd – and happened upon a segment with Mark Gurman, Managing Editor and Chief Correspondent at Bloomberg News:

They discussed how Gurman got started covering Apple, the stories resurfacing this week around the Siri update being powered by Google’s Gemini, and the iPhone 17 lineup.

View the clip on YouTube.

 

Categories
Links

Shortcuts Showdown on The Vergecast »

From Stephen Robles:

David Pierce, host of The Vergecast, often complains that Shortcuts is too complicated and not useful. Equally as often, I tell him he’s wrong on social media, but this time I got to do it live! My thanks to David for inviting me on The Vergecast, and I’m pretty sure I won this round.

Check the post on Stephen’s site Beard.FM, check out the episode of The Vergecast (and follow the show on Apple Podcasts), and see the full video on YouTube.

Categories
Links

Creative Neglect: What About the Apps in Apple? »

Joe Rosensteel, writing for Six Colors:

One of the things that I think about from time to time is Apple’s collection of apps. Some are the crown jewels, like Apple’s pro apps, and others help an everyday consumer to tackle their iLife. All are pretty starved for attention and resources, outside of infrequent updates aligned with showing off the native power of Apple Silicon, Apple Intelligence, or demos of platform integration that never quite get all the way there.

Three things really brought this up to the surface for me recently: The neglect of Clips and iMovie, the radio silence regarding Pixelmator/Photomator, and Final Cut Pro being trotted out for demos but not shipping appropriate updates.

I agree with Joe’s sentiment, but direct it more towards—you guessed it—the Shortcuts app than Pixelmator, which I’ve been saying is within a reasonable window for updating – anything they’re working on could only feasibly ship after an entire yearly cycle.

Shortcuts, on the other hand, has been out for over 5 years and still hasn’t evolved too far beyond its original Workflow UX – Six Colors’ own Jason Snell just talked about how Shortcuts is not really that friendly on Monday of this week.

Read the whole story on Six Colors.

 

Categories
Links

OpenAI acquires Software Applications Incorporated, maker of Sky

From the OpenAI company blog:

AI progress isn’t only about advancing intelligence—it’s about unlocking it through interfaces that understand context, adapt to your intent, and work seamlessly. That’s why we’re excited to share that OpenAI has acquired Software Applications Incorporated, makers of Sky.

And:

“We’ve always wanted computers to be more empowering, customizable, and intuitive. With LLMs, we can finally put the pieces together. That’s why we built Sky, an AI experience that floats over your desktop to help you think and create. We’re thrilled to join OpenAI to bring that vision to hundreds of millions of people.” —Ari Weinstein, Co-Founder and CEO, Software Applications Incorporated

Incredible run by my former teammates – first selling Workflow to Apple, and now Sky to OpenAI.

I’m super excited to see their deep talent and passion reflected in the ChatGPT app.

Read the full blog post from OpenAI.

 

Categories
Links

Jason Snell: Shortcuts Is Not Really That Friendly

From Jason Snell, on Upgrade: An LLM in the Woods:

“It’s like me saying, oh, you know, Shortcuts does a pretty good job of being a consumer user scripting utility.

It’s like, well, yeah, but also really no.”

Plus, later:

“I mean, that’s the bottom line is it’s a great idea. And like I said about Misty Studio[1], all things considered, it does a pretty good job of being kind of a friendly face to building an AI model, but in the end, it’s like Shortcuts in that it’s not really that friendly.”

Fair enough – if it truly was, I’d have been out of a job for a long time.

Check out the Upgrade podcast on Apple Podcasts and YouTube.[2]

[1] For reference, they explained Misty Studio earlier: “Misty Studio is a demo that Apple did for the M5. Misty Studio runs an open-source model locally.”
[2] P.S. I apologize in advance to Jason for the URL slug 🙂

 

Categories
Developer Links News

Apple’s Foundation Models Framework Unlocks New App Experiences Powered by Apple Intelligence »

From Apple Newsroom:

With the release of iOS 26, iPadOS 26, and macOS 26 this month, developers around the world are able to bring even more intelligent experiences right into their apps by tapping into the on-device large language model at the core of Apple Intelligence.

The Foundation Models framework allows developers to create new intelligence features that protect users’ privacy and are available offline, all while using AI inference that is free of cost.

You can now add intelligence to your apps for free on Apple platforms – and while it’s relatively simple today… that’s only for now.
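For a rough idea of what “tapping into the on-device large language model” looks like in code, here’s a minimal sketch using the Foundation Models framework – the summarize wrapper and the prompt are mine, purely for illustration:

```swift
import FoundationModels

// Minimal sketch: ask the on-device model to summarize a piece of text.
func summarize(_ text: String) async throws -> String {
    // Apple Intelligence (and therefore the model) may be unavailable on some
    // devices or while the model downloads, so check before prompting.
    guard case .available = SystemLanguageModel.default.availability else {
        return text
    }

    let session = LanguageModelSession(
        instructions: "Summarize the user's text in a single sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```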

View the full article.

Categories
Links News Siri Shortcuts

What’s New in Shortcuts for iOS 26 »

From Apple Support:

New in iOS 26, iPadOS 26, macOS 26, watchOS 26, and visionOS 26

This update includes enhancements to the Shortcuts app across all platforms, including new intelligent actions and an improved editing experience. Shortcuts on macOS now supports personal automations that can be triggered based on events such as time of day or when you take actions like saving a file to a folder, as well as new integrations with Control Center and Spotlight.

New Actions (Editor’s note: shortened for sake of space)

  • Freeform
  • Image Playground, requires Apple Intelligence*
  • Mail
  • Measure
  • Messages
  • Screen Time
  • Sports
  • Photos
  • Reminders
  • Stocks
  • Use Model, requires Apple Intelligence*
  • Visual Intelligence, requires Apple Intelligence*
  • Voice Memos
  • Weather
  • Writing Tools, requires Apple Intelligence*

Updated Actions

For those building custom shortcuts, some actions have been updated:

  • “Calculate Expression” can now evaluate expressions that include units, including real time currency conversion rates, temperature, distance, and more
  • “Create QR Code” can now specify colors and styling
  • “Date” can now specify a holiday
  • “Find Contacts” can now filter by relationship
  • “Transcribe Audio” performance has been improved
  • “Show Content” can now display scrollable lists of items, like calendar events, reminders, and more

Shortcut Editor

For those building custom shortcuts, updates have been made to the shortcut editor:

  • Improved drag and drop and variable selection
  • Over 100 new icon glyphs are now available, including new shapes, transportation symbols, and more
  • Rich previews of calendar events, reminders, and more
  • The ability to choose whether shortcuts appear in Spotlight Search

macOS Improvements

Spotlight

Shortcuts can now accept input, like selected text from an open document, when being run from Spotlight.

Automations

Shortcuts can now be run automatically based on the following triggers:

  • Time of Day (“At 8:00 AM, weekdays”)
  • Alarm (“When my alarm is stopped”)
  • Email (“When I get an email from Jane”)
  • Message (“When I get a message from Mom”)
  • Folder (“When files are added to my Documents folder”)
  • File (“When my file is modified”)
  • External Drive (“When my external drive connects”)
  • Wi-Fi (“When my Mac joins home Wi-Fi”)
  • Bluetooth (“When my Mac connects to AirPods”)
  • Display (“When my display connects”)
  • Stage Manager (“When Stage Manager is turned on”)
  • App (“When ‘Weather’ is opened or closed”)
  • Battery Level (“When battery level rises above 50%”)
  • Charger (“When my Mac connects to power”)
  • Focus (“When turning Do Not Disturb on”)

Control Center

Shortcuts can be added as controls to Control Center and the menu bar, including Run Shortcut, Open App, and Show “Menu Bar” Collection.

View the full release notes from Apple Support.

Categories
Links

New Apple Intelligence Features Are Available Today »

From Apple Newsroom:

Search and Take Action with Updates to Visual Intelligence

Visual intelligence, which builds on Apple Intelligence, now helps users learn and do more with the content on their iPhone screen. It makes it faster than ever for users to search, take action, and answer questions about the content they’re viewing across their apps.

Users can search for the content on their iPhone screen to find similar images across Google, as well as apps that integrate this experience, such as eBay, Poshmark, Etsy, and more. If there’s an object a user is interested in learning about, like a pair of shoes, they can simply press the same buttons used to take a screenshot and highlight it to search for that specific item or similar objects online. And with ChatGPT, users can ask questions about anything they’re viewing onscreen.

Video: Visual Intelligence on iPhone 17 Pro

Updates to visual intelligence help users learn and do more with the content on their iPhone screen.

Visual intelligence enables users to summarize and translate text, as well as add an event from a flyer on their iPhone screen to their calendar, with a single tap.

Users can also take advantage of these capabilities by using visual intelligence with their iPhone camera through Camera Control, the Action button, and in Control Center.

And:

Build Intelligent Shortcuts

Shortcuts help users accomplish more faster, by combining multiple steps from their favorite apps into powerful, personal automations. And now with Apple Intelligence, users can take advantage of intelligent actions in the Shortcuts app to create automations, like summarizing text with Writing Tools or creating images with Image Playground.

Users can tap into Apple Intelligence models, either on device or with Private Cloud Compute to generate responses that feed into the rest of their shortcut, maintaining the privacy of information used in the shortcut. For example, users can create powerful Shortcuts like comparing an audio transcription to typed notes, summarizing documents by their contents, extracting information from a PDF and adding key details to a spreadsheet, and more.

View the full story from Apple Newsroom.

Categories
Apps Gear Links

Stream Deck 7 Adds Virtual Decks, Key Logic, Weather, and App Status

My friends at Elgato have updated the Stream Deck app for Mac to version 7.0 with new features for creating virtual Stream Decks on your computer, Key Logic so each key can have multi-tap abilities, weather updates in a new plugin, and quality-of-life features like showing whether an app is currently open.

Here’s how they describe the updates:

🎛️ Virtual Stream Deck — your on-screen workspace controller

Create unlimited virtual keys, customize actions and layouts, then pin them in place or summon to your cursor. It’s your OS sidekick, making every workflow fast and effortless. It’s Stream Deck on your computer, anywhere you go.

[…]

👇 Key Logic — multi-tap abilities

Assign up to three different actions to a single key using Key Logic. Perform a unique action based on how the key is pressed:

  • Press
  • Double press
  • Press and hold

For example, press to play/pause music, double press to skip tracks, or press and hold to go to the previous track.

[…]

⛅ Weather plugin – stay ahead of the forecast

The new Weather Plugin for Stream Deck puts live weather updates and forecasts at your fingertips, with minimal setup and configuration. Instantly see the sky’s latest mood and plan your day without ever picking up your phone or opening a browser.

[…]

🛠️ Improvements and bug fixes

The Open Application action now displays a green dot when the selected app is running.

You can now configure the Open Application action to either do nothing, close, or force quit the selected app when long-pressed.[…]

Virtual Stream Decks are extremely cool.

Check out the Elgato Stream Deck 7.0 Release Notes and get the Stream Deck from Elgato – be sure to use my discount code ZZ-CASSINELLI for 5% off.

Categories
Links

Apple News+ introduces Emoji Game 🍎📰➕ 😀🧩

From the Apple Newsroom:

Today, Apple News+ debuted Emoji Game, an original puzzle that challenges subscribers to use emoji to complete short phrases. Emoji Game is now available in English for Apple News+ subscribers in the U.S. and Canada.

“Emoji Game is the perfect addition to the Apple News+ suite of word and number puzzles, turning the emoji we use every day into a brainteaser that’s approachable and fun,” said Lauren Kern, editor-in-chief of Apple News.

More Apple News shortcuts incoming in 3… 2… 1…

View the full story from Apple.

 

Categories
Developer Links News

Apple Supercharges Its Tools and Technologies for Developers to Foster Creativity, Innovation, and Design »

From Apple’s announcements at WWDC:

App Intents lets developers deeply integrate their app’s actions and content with system experiences across platforms, including Siri, Spotlight, widgets, controls, and more.

This year, App Intents gains support for visual intelligence. This enables apps to provide visual search results within the visual intelligence experience, allowing users to go directly into the app from those results. For instance, Etsy is leveraging visual intelligence to enhance the user experience in its iOS app by facilitating faster and more intuitive discovery of goods and products.

“At Etsy, our job is to seamlessly connect shoppers with creative entrepreneurs around the world who offer extraordinary items — many of which are hard to describe. The ability to meet shoppers right on their iPhone with visual intelligence is a meaningful unlock, and makes it easier than ever for buyers to quickly discover exactly what they’re looking for while directly supporting small businesses,” said Etsy CTO Rafe Colburn.

Read the full post from the Apple Newsroom.

Categories
Gear How To Links

How Fast Should My Internet Be To Stream? »

Elgato has shared a helpful guide for ensuring your internet is fast enough for streaming:

Streaming your content live online is more accessible than ever and also can be equally data hungry. From streaming right from your phone to your followers on TikTok to streaming a professional event in 4K 60fps with High Dynamic Range on YouTube, these all require some amount of bandwidth to get your live content to where it needs to be.

And:

In short, here’s the maximum bitrates supported by the services.

Twitch: 6Mbps (up to 1080p)

YouTube: 40Mbps (Up to 4K)

Also, these are upload speeds – and if you’re streaming to both services at once, it can take even more. I use High Quality Audio and High Quality Video in Ecamm Live, which also add to the network load.
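As a rough worked example (my math, not Elgato’s): streaming to both services at their maximums would take about 6 + 40 = 46 Mbps of upload on its own, and leaving 20 to 30 percent of headroom for everything else on your network puts you closer to needing a 60 Mbps upload plan.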

Check out the post from Elgato and get Ecamm Live to stream from your Mac.

Also, be sure to use my discount code ZZ-CASSINELLI for 5% off from Elgato.

Categories
Gear How To Links

How To Archive Your Live Stream And Why To Do It »

Elgato has shared a handy guide for making sure you always have a copy of your livestream for later – something that tripped me up when I first started on Twitch:

You just finished a stream full of funny moments, interesting discussion, or a great final stand in a battle royale. But what happened to all that content? Did it just end up in the void or did you make sure to save it for later?

Some streaming services like Twitch.TV have time limits for how long they’ll hold onto your past livestream. If you want those moments to live on, you’ll need to archive them somehow. And in some cases, videos are simply unavailable after the stream due to copyright reasons. If you don’t enable storage of those past streams, it’s been too long, or you listened to some music on stream by accident, those moments could just be history.

If you stream right onto YouTube, you’re already set, as those will automatically be archived as a regular video, as long as you weren’t doing a subathon for over 12 hours. If all you want is for your streams to live on, you’re good to go.

Check out the post from Elgato and follow me on Twitch.

Also, be sure to use my discount code ZZ-CASSINELLI for 5% off from Elgato.

Categories
Apps Gear Links

The Stream Deck App Is More Mac-Like in Version 6.9

My friends at Elgato have updated the Stream Deck app for Mac to version 6.9 with new features for launching and closing an app from the same key, pasting from the clipboard in Text actions, the ability to choose your browser when opening URLs, and quality-of-life improvements specifically to make the app more Mac-like.

Here’s how they describe the updates:

🆕 Open App – From launch to close, all on one key

Launching your favorite apps just got easier. The new Open Application action shows a searchable list of installed programs—no file path hunting required.

And it now works with Windows Store (UWP) apps like Discord, Spotify, and Microsoft Teams.

✨ Pro tip: Press and hold the key to close the app. Quick in, quick out.

[…]

📝 Text Action – Now with Clipboard Paste Mode

The Text action now includes paste mode selection with options for simulating typing or pasting from clipboard. Simulate typing is exactly that—it’s as if you’re typing the text. Paste from Clipboard is new and text is pasted as if you’ve pressed CTRL/CMD+V. The default mode has been changed to Paste from Clipboard.

Simulate Typing is useful for programs that need inputs, like typing out commands in sim games

Paste from Clipboard is useful for chat applications, when you want to paste a block of text as a single message

[…]

🌐 Website Action – Choose Your Browser

Now you can choose exactly where your Website actions open. Chrome for work, Firefox for dev, Safari for testing—it’s all up to you.

Just pick from your installed browsers in the dropdown. The old “GET request in background” toggle now lives here too, tucked away for power users.

[…]

⬇️ Profiles just got smarter

Importing profiles from Marketplace? Stream Deck 6.9 now helps you hit the ground running. If a profile uses actions from plugins you don’t have, you’ll be prompted to install them automatically—no guesswork, no more question marks.

[…]

🪄 Quality of life improvements

A few small touches make a big difference:

Option to launch Stream Deck on startup

Option to disable automatic update checks

macOS only: Stream Deck now appears in the Dock when open

macOS only: You can now maximize the app using the green button

[…]

Check out the Elgato Stream Deck 6.9 Release Notes and get the Stream Deck from Elgato – be sure to use my discount code ZZ-CASSINELLI for 5% off.

Categories
Announcements Developer Links News

Apple Is Delaying the ‘More Personalized Siri’ Apple Intelligence Features »

On Wednesday, March 5, I posted the blog post “New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4”. Today, Friday, March 7, Apple said “Nope” to John Gruber – here’s the quote from Apple spokesperson Jacqueline Roy from his story on Daring Fireball:

“Siri helps our users find what they need and get things done quickly, and in just the past six months, we’ve made Siri more conversational, introduced new features like type to Siri and product knowledge, and added an integration with ChatGPT. We’ve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps. It’s going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year.”

Gruber also gives his analysis of the situation, which you should read in full.

Oh, and those API pages? Gone.

View the whole story on Daring Fireball.

Categories
Gear Links News

Apple unveils new Mac Studio, the most powerful Mac ever »

From the Apple Newsroom:

Apple today announced the new Mac Studio, the most powerful Mac ever made, featuring M4 Max and the new M3 Ultra chip. The ultimate pro desktop delivers groundbreaking pro performance, extensive connectivity now with Thunderbolt 5, and new capabilities in its compact and quiet design that can live right on a desk. Mac Studio can tackle the most intense workloads with its powerful CPU, Apple’s advanced graphics architecture, higher unified memory capacity, ultrafast SSD storage, and a faster and more efficient Neural Engine.

My M1 Mac mini from 2020 is also way overdue for an upgrade…

View the original.