Categories
Links News

CNBC: New Siri is “still on track to launch in 2026” »

From Jennifer Elias at CNBC, in a story titled “Apple’s stock has worst day since April as iPhone maker faces FTC scrutiny, reports of Siri delay”:

Apple told CNBC it is still on track to launch in 2026.

While Apple only ever confirmed Siri would launch in 2026, the rumor that the iOS 26.4 beta would give us access became accepted as fact – even by me, to be honest. Now Apple has reset expectations, but it needs to stop investor sentiment from getting out of hand.

View the original on CNBC (via 9to5Mac).

Categories
Links News

Amazon Alexa+ AI assistant now available free for Prime members »

From Amazon News:

With an entirely new architecture powered by large language models—from both Amazon Nova and Anthropic—Alexa+ is significantly more powerful, and customers are using it in completely new and different ways. They’ve moved from simple, formulaic requests to much deeper, more complex interactions—they’re streaming more music and having deep conversations about discographies, genres, and artists Alexa recommends just for them; they’re settling dinner table debates with a quick question, exploring complex topics, discussing the news of the day, and having deeper ongoing conversations with Alexa (sometimes over days, because Alexa+ can remember context). They are also interacting with Alexa+ in more places, chatting on the go in the Alexa app, and doing deeper research, planning, and generating content at Alexa.com—overall, customers are interacting with Alexa+ more than twice as much.

We’re finally leaving behind simple voice assistants and getting into the era of proper smart assistants.

View the original.

Categories
Links News

Apple is Hiring a Shortcuts Tech Lead Manager

From Careers at Apple:

Summary

Play a part in the next revolution in human-computer interaction. Contribute to a product that helps users tune their devices, making them more personal. Create groundbreaking technology to provide intelligence around the apps you use every day. Work with the people who created Shortcuts, Siri, and other system features that help millions of people get things done.

Our team is looking for engineers experienced with working on Apple platforms who are passionate about building complex, performant systems that power Apple Intelligence. In this role, you’ll be part of a cross-functional and collaborative team that works on frameworks and systems that interact with first-party apps and system services. You’ll ship code that runs on the devices you use every day and powers products that are critical to the lives of millions of users!

Description

You will primarily be responsible for developing features and driving performance for the internal frameworks and subsystems that enable action running on Apple platforms. As a Tech Lead Manager, you will manage a small team of one or two engineers while actively contributing code and providing technical leadership. This position is ideal for those interested in stepping into management while staying connected to technical work, or for experienced managers looking to balance leadership with hands-on development.

As a strong programmer and a creative problem solver, you will break down interesting technical challenges and create robust, performant solutions. You will work across teams and organizations, building relationships and crafting compelling system features. You finish projects with a keen eye to the details that surprise and delight customers. You are driven by building software that operates in extremely tight tolerances, where the pursuit of quality and the satisfaction of solving challenging technical challenges and constraints fuels your best work. You will also play a crucial role in guiding our existing products, leveraging your ability to anticipate issues before they arrive, and lead development of essential technologies in early stages. You care deeply about software architecture and writing code that is robust and maintainable for the future. You are excited about developing new features, as well as maintaining existing code, fixing bugs, and contributing to overall system design. You know it’s all in the details.

Posted January 22, 2026. Make sure to peep the Pay & Benefits…

View the original.

Categories
News

New ‘HomePad’ launch in spring corroborated by The Information »

From Wayne Ma and Qianer Liu’s report at The Information, quoted by Ryan Christoffel of 9to5Mac:

Apple is also working on a home product featuring a small display, speakers and a robotic swiveling base, designed with a heavy emphasis on AI features. That device could be released as soon as this spring, according to two of the people.

At this point, I am convinced this device will run on App Intents using Interactive Snippets.
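
For the unfamiliar: interactive snippets are the App Intents feature, introduced with iOS 26, that lets an intent present a live SwiftUI view in the system UI. Here's a minimal sketch of the underlying pattern – an intent returning a snippet view – with the intent and view names being purely illustrative:

```swift
import AppIntents
import SwiftUI

// Illustrative intent that returns a SwiftUI snippet view –
// the pattern that interactive snippets build on.
struct ShowTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Timer"

    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        .result(dialog: "Timer started.", view: TimerSnippetView(minutes: 5))
    }
}

// Hypothetical view the system renders as the snippet.
struct TimerSnippetView: View {
    let minutes: Int

    var body: some View {
        Label("\(minutes)-minute timer", systemImage: "timer")
            .padding()
    }
}
```

A screen-first home device could plausibly be driven almost entirely by intents rendering views this way.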

View the linked story on 9to5Mac and view the original report on The Information (paywalled).

Categories
Links News

Inside Enchanté, Apple’s AI chatbot for employee productivity »

From Filipe Esposito for Macworld:

Macworld has learned that the company has begun rolling out two new AI-powered apps more broadly to employees in its offices. Both tools are designed for employees to not only test AI capabilities in real-world scenarios, a source said, but also integrate them into their workflow and even help improve Apple Intelligence.

[…]

The first app, called Enchanté, functions as an internal ChatGPT-like assistant for employees. The app can be used to assist employees with ideas, development, proofreading, and even general knowledge answers. The interface looks quite similar to what you see in the ChatGPT app for macOS.

And:

The second internal app, known as Enterprise Assistant, is far more specialized. Built entirely around Apple’s internal large language models (LLMs), Enterprise Assistant acts as a centralized knowledge hub for corporate employees.

View the original.

Categories
Links News

Apple reportedly replacing Siri interface with actual chatbot experience for iOS 27 »

Mark Gurman, quoted by Zac Hall on 9to5Mac:

Apple Inc. plans to revamp Siri later this year by turning the digital assistant into the company’s first artificial intelligence chatbot, thrusting the iPhone maker into a generative AI race dominated by OpenAI and Google.

The chatbot — code-named Campos — will be embedded deeply into the iPhone, iPad and Mac operating systems and replace the current Siri interface, according to people familiar with the plan. Users will be able to summon the new service the same way they open Siri now, by speaking the “Siri” command or holding down the side button on their iPhone or iPad.

More details on the upcoming chatbot version of Siri.

View the quoted story on 9to5Mac and view the original on Bloomberg (paywalled).


Categories
Links News

Apple’s Two-Phase AI Rollout – New Siri Soon, Chatbot Later »

From Mark Gurman, quoted by Jason Snell on Six Colors:

The previously promised, non-chatbot update to Siri — retaining the current interface — is planned for iOS 26.4, due in the coming months. The idea behind that upgrade is to add features unveiled in 2024, including the ability to analyze on-screen content and tap into personal data. It also will be better at searching the web.

And:

The chatbot capabilities will come later in the year, according to the people, who asked not to be identified because the plans are private. The company aims to unveil that technology in June at its Worldwide Developers Conference and release it in September.


Jason speculates on the policy shift Apple would need to make for these features to run on Google’s servers, too.

View the quote on Six Colors and view the original on Bloomberg (paywalled).

Categories
Developer News

OpenClaw Showed Me What the Future of Personal AI Assistants Looks Like »

From Federico Viticci for MacStories:

For the past week or so, I’ve been working with a digital assistant that knows my name, my preferences for my morning routine, how I like to use Notion and Todoist, but which also knows how to control Spotify and my Sonos speaker, my Philips Hue lights, as well as my Gmail. It runs on Anthropic’s Claude Opus 4.5 model, but I chat with it using Telegram. I called the assistant Navi (inspired by the fairy companion of Ocarina of Time, not the besieged alien race in James Cameron’s sci-fi film saga), and Navi can even receive audio messages from me and respond with other audio messages generated with the latest ElevenLabs text-to-speech model. Oh, and did I mention that Navi can improve itself with new features and that it’s running on my own M4 Mac mini server?

If this intro just gave you whiplash, imagine my reaction when I first started playing around with Clawdbot, the incredible open-source project by Peter Steinberger (a name that should be familiar to longtime MacStories readers) that’s become very popular in certain AI communities over the past few weeks.

If the Claude Code craze over winter break wasn’t enough, the project-formerly-known-as-Clawdbot has taken over timelines as the next level of AI interaction on our personal machines – Federico has a great rundown.

View the original.

Categories
Links News

Apple Intelligence Siri is over a year late, but that might be a good thing »

From Michael Burkhardt at 9to5Mac:

All of this time passing means one thing: a lot more people have an Apple Intelligence-capable device.

Everyone who bought any iPhone 16 or iPhone 17 model at all in the past two years (plus anyone who already had an iPhone 15 Pro), will be able to use Apple Intelligence.

*Apple engineer taking this to their boss*: “You see? It’s actually a good thing we were so late.”1

View the original.

  1. Burkhardt is correct here, but Apple still shouldn’t have been so late with it.
Categories
News

Apple partners with Gemini to power the new Siri »

On Monday, January 12, Google published a joint statement from Google and Apple:

Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google’s Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year.

After careful evaluation, Apple determined that Google’s AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users. Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple’s industry-leading privacy standards.

View the statement and the post on X.

Categories
Links News

John Giannandrea to retire from Apple »

From Apple Newsroom – quoted in full:

Apple today announced John Giannandrea, Apple’s senior vice president for Machine Learning and AI Strategy, is stepping down from his position and will serve as an advisor to the company before retiring in the spring of 2026. Apple also announced that renowned AI researcher Amar Subramanya has joined Apple as vice president of AI, reporting to Craig Federighi. Subramanya will be leading critical areas, including Apple Foundation Models, ML research, and AI Safety and Evaluation. The balance of Giannandrea’s organization will shift to Sabih Khan and Eddy Cue to align closer with similar organizations.

Since joining Apple in 2018, Giannandrea has played a key role in the company’s AI and machine learning strategy, building a world-class team and leading them to develop and deploy critical AI technologies. This team is currently responsible for Apple Foundation Models, Search and Knowledge, Machine Learning Research, and AI Infrastructure.

Subramanya brings a wealth of experience to Apple, having most recently served as corporate vice president of AI at Microsoft, and previously spent 16 years at Google, where he was head of engineering for Google’s Gemini Assistant prior to his departure. His deep expertise in both AI and ML research and in integrating that research into products and features will be important to Apple’s ongoing innovation and future Apple Intelligence features.

“We are thankful for the role John played in building and advancing our AI work, helping Apple continue to innovate and enrich the lives of our users,” said Tim Cook, Apple’s CEO. “AI has long been central to Apple’s strategy, and we are pleased to welcome Amar to Craig’s leadership team and to bring his extraordinary AI expertise to Apple. In addition to growing his leadership team and AI responsibilities with Amar’s joining, Craig has been instrumental in driving our AI efforts, including overseeing our work to bring a more personalized Siri to users next year.”

These leadership moves will help Apple continue to push the boundaries of what’s possible. With Giannandrea’s contributions as a foundation, Federighi’s expanded oversight and Subramanya’s deep expertise guiding the next generation of AI technologies, Apple is poised to accelerate its work in delivering intelligent, trusted, and profoundly personal experiences. This moment marks an exciting new chapter as Apple strengthens its commitment to shaping the future of AI for users everywhere.

Huge transition for Apple, which has been bleeding AI researchers for months.

View the original.

Categories
Announcements News

Shortcuts Creator Stephen Robles Goes Solo; Sign Up For His Membership

If you’ve seen any linked posts on my blog, you’ve probably seen Stephen Robles – a prolific content creator who’s made a massive impact in the Shortcuts community over the last few years. Just last week, Stephen announced that, thanks to the support of his community, he’s gone solo and left his job to pursue YouTube & his Shortcuts membership full-time.

My congratulations to Stephen – he’s an amazing person, a hard worker, and has given me so much value with his Shortcuts work.

Here’s what his membership program offers:

Become a member of the community and gain access to my ShortcutsGPT, Shortcut of the Week, searchable Shortcuts database, priority Shortcuts requests, and more!

Watch his announcement video, follow Stephen on YouTube, and sign up for his membership program to support his independence.

Categories
Developer Links News

Apple’s Foundation Models Framework Unlocks New App Experiences Powered by Apple Intelligence »

From Apple Newsroom:

With the release of iOS 26, iPadOS 26, and macOS 26 this month, developers around the world are able to bring even more intelligent experiences right into their apps by tapping into the on-device large language model at the core of Apple Intelligence.

The Foundation Models framework allows developers to create new intelligence features that protect users’ privacy and are available offline, all while using AI inference that is free of cost.

You can now add intelligence to your apps for free on Apple platforms – and while the capabilities are relatively simple today… that’s only for now.
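
To show just how simple: here’s a minimal sketch of calling the on-device model with the Foundation Models framework. The availability check matters, since not every device supports Apple Intelligence; the instructions and function are my own illustration:

```swift
import FoundationModels

// Minimal sketch: confirm the on-device model is available,
// then ask it for a response in a fresh session.
func oneSentenceSummary(of text: String) async throws -> String? {
    guard case .available = SystemLanguageModel.default.availability else {
        return nil // Apple Intelligence unavailable or disabled on this device
    }

    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

The same session object can carry a multi-turn exchange, which is what makes free, offline inference so interesting for app developers.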

View the full article.

Categories
Links News Shortcuts

What’s New in Shortcuts for iOS 26 »

From Apple Support:

New in iOS 26, iPadOS 26, macOS 26, watchOS 26, and visionOS 26

This update includes enhancements to the Shortcuts app across all platforms, including new intelligent actions and an improved editing experience. Shortcuts on macOS now supports personal automations that can be triggered based on events such as time of day or when you take actions like saving a file to a folder, as well as new integrations with Control Center and Spotlight.

New Actions (Editor’s note: shortened for sake of space)

  • Freeform
  • Image Playground, requires Apple Intelligence*
  • Mail
  • Measure
  • Messages
  • Screen Time
  • Sports
  • Photos
  • Reminders
  • Stocks
  • Use Model, requires Apple Intelligence*
  • Visual Intelligence, requires Apple Intelligence*
  • Voice Memos
  • Weather
  • Writing Tools, requires Apple Intelligence*

Updated Actions

For those building custom shortcuts, some actions have been updated:

  • “Calculate Expression” can now evaluate expressions that include units, including real time currency conversion rates, temperature, distance, and more
  • “Create QR Code” can now specify colors and styling
  • “Date” can now specify a holiday
  • “Find Contacts” can now filter by relationship
  • “Transcribe Audio” performance has been improved
  • “Show Content” can now display scrollable lists of items, like calendar events, reminders, and more

Shortcut Editor

For those building custom shortcuts, updates have been made to the shortcut editor:

  • Improved drag and drop and variable selection
  • Over 100 new icon glyphs are now available, including new shapes, transportation symbols, and more
  • Rich previews of calendar events, reminders, and more
  • The ability to choose whether shortcuts appear in Spotlight Search

macOS Improvements

Spotlight

Shortcuts can now accept input, like selected text from an open document, when being run from Spotlight.

Automations

Shortcuts can now be run automatically based on the following triggers:

  • Time of Day (“At 8:00 AM, weekdays”)
  • Alarm (“When my alarm is stopped”)
  • Email (“When I get an email from Jane”)
  • Message (“When I get a message from Mom”)
  • Folder (“When files are added to my Documents folder”)
  • File (“When my file is modified”)
  • External Drive (“When my external drive connects”)
  • Wi-Fi (“When my Mac joins home Wi-Fi”)
  • Bluetooth (“When my Mac connects to AirPods”)
  • Display (“When my display connects”)
  • Stage Manager (“When Stage Manager is turned on”)
  • App (“When ‘Weather’ is opened or closed”)
  • Battery Level (“When battery level rises above 50%”)
  • Charger (“When my Mac connects to power”)
  • Focus (“When turning Do Not Disturb on”)

Control Center

Shortcuts can be added as controls to Control Center and the menu bar, including Run Shortcut, Open App, and Show “Menu Bar” Collection.

View the full release notes from Apple Support.

Categories
News

Shortcuts gains actions for Apple Intelligence, Messages, and Notes checklists in iOS 26

In iOS 26, Apple is adding a series of exciting new actions to Shortcuts, with a heavy focus on Apple Intelligence, including direct access to their Foundation Models with the new Use Model action.

Alongside that, Apple has actions for Writing Tools, Image Playground, and Visual Intelligence, plus the ability to Add Files to Freeform and Notes, Export in Background from the iWork apps, and new Find Conversations & Find Messages actions for the Messages app, among others.

Plus, new updates to current actions—like turning Show Result into Show Content—make existing functionality easier to understand.

Here’s everything that’s new – available now in Public Beta:

Apple Intelligence


The major focus of actions in iOS 26 is access to Apple Intelligence, both directly from the Foundation Models and indirectly through pre-built Writing Tools actions and Image Playground actions – plus a simple “Open to Visual Intelligence” action that seems perfectly suited for the Action button.

Use Model

  • Use Model
    • Private Cloud Compute
    • Offline
    • ChatGPT Extension

Writing Tools

  • Make Table from Text
  • Make List from Text
  • Adjust Tone of Text
  • Proofread Text
  • Make Text Concise
  • Rewrite Text
  • Summarize Text

Visual Intelligence

  • Open to Visual Intelligence

Image Playground

  • Create Image

Actions

Apple has added new actions for system apps and features, starting with an interesting Search action that pulls in a set number of results, similar to Spotlight.

Both Freeform and Notes got “Add File” actions, plus you can now add directly to checklists in Notes too. Apple put Background Tasks to work exporting from the iWork apps, and nice-to-have actions for Sports, Photos, and Weather make it easier to take advantage of those apps.

Particularly nice are Find Conversations and Find Messages, the former of which works well with Open Conversation, and the latter of which is a powerful search tool.

Search

  • Search

Freeform

  • Add File to Freeform

Notes

  • Add File to Notes
  • Append Checklist Item

iWork

  • Export Spreadsheet in Background
  • Export Document in Background
  • Export Presentation in Background

Documents

  • Convert to USDZ

Sports

  • Get Upcoming Sports Events

Photos

  • Create Memory Movie

Messages

  • Find Conversations
  • Find Messages

Weather

  • Add Location to List
  • Remove Location from List

Updated

Apple continues to make Shortcuts actions easier to understand and adopt for new users, making small tweaks like clarifying Show Content and Repeat with Each Item.

Plus, existing actions like Calculate Expression, Translate, and Transcribe have benefitted from system-level improvements:

  • Show Result is now titled Show Content
  • Repeat with Each is now labeled “Repeat with Each Item” once placed
  • Open Board for Freeform now shows as App Shortcuts
  • Calculate Expression can accept real-time currency data
  • Translate has been improved
  • Transcribe has been improved
  • “Use Search as Input” added to Shortcut Input

Coming this Fall

These new actions are available now in Public Beta—install at your own risk—and will be fully available in the fall once iOS 26 releases.

There are also further improvements on the Mac, which gained Automations in Shortcuts—including unique File, Folder, and Drive automations only available on Mac—plus the ability to run actions directly in Spotlight. I’ll cover these in future stories – be sure to check the features out if you’re on the betas.

I will update this post if any more actions are discovered in future betas, or if there’s anything I’ve missed here.

P.S. See Apple’s video “Develop for Shortcuts and Spotlight with App Intents” for the example shortcut in the header photo.
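
That session is also a good developer-side pointer: the core pattern it covers is exposing intents as App Shortcuts so they surface in Spotlight, Shortcuts, and Siri automatically. A minimal sketch of that pattern, with every name here being illustrative:

```swift
import AppIntents

// Illustrative intent exposed as an App Shortcut, making it available
// in Spotlight, Shortcuts, and Siri with no user setup required.
struct OpenInboxIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Inbox"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        .result()
    }
}

struct ExampleShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenInboxIntent(),
            phrases: ["Open my \(.applicationName) inbox"],
            shortTitle: "Open Inbox",
            systemImageName: "tray"
        )
    }
}
```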

Categories
Developer Links News

Apple Supercharges Its Tools and Technologies for Developers to Foster Creativity, Innovation, and Design »

From Apple’s announcements at WWDC:

App Intents lets developers deeply integrate their app’s actions and content with system experiences across platforms, including Siri, Spotlight, widgets, controls, and more.

This year, App Intents gains support for visual intelligence. This enables apps to provide visual search results within the visual intelligence experience, allowing users to go directly into the app from those results. For instance, Etsy is leveraging visual intelligence to enhance the user experience in its iOS app by facilitating faster and more intuitive discovery of goods and products.

“At Etsy, our job is to seamlessly connect shoppers with creative entrepreneurs around the world who offer extraordinary items — many of which are hard to describe. The ability to meet shoppers right on their iPhone with visual intelligence is a meaningful unlock, and makes it easier than ever for buyers to quickly discover exactly what they’re looking for while directly supporting small businesses,” said Etsy CTO Rafe Colburn.
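
From the developer’s side, as I understand the integration, an app implements a value query that receives a semantic description of what the camera sees and returns its own matching entities. A sketch under that assumption – the entity, query, and matching logic are all made up, and the exact descriptor API should be treated as provisional:

```swift
import AppIntents
import VisualIntelligence

// Hypothetical catalog item exposed to the system as an App Entity.
struct ItemEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Item"
    static var defaultQuery = ItemQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct ItemQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [ItemEntity] {
        [] // Look up items by ID in your own store
    }
}

// The visual intelligence hook: the system passes a descriptor of the
// camera's view; the app returns matching entities as search results.
struct VisualItemQuery: IntentValueQuery {
    func values(for input: SemanticContentDescriptor) async throws -> [ItemEntity] {
        // A real app would search its catalog using the descriptor's
        // labels and/or pixel buffer; this just echoes the first label.
        guard let label = input.labels.first else { return [] }
        return [ItemEntity(id: label, name: label)]
    }
}
```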

Read the full post from the Apple Newsroom.

Categories
Custom Shortcuts News Shortcuts

Apple Releases “Hold That Thought” Shortcut for Global Accessibility Awareness Day

Today is Global Accessibility Awareness Day (GAAD), which Apple highlighted in their press release showcasing accessibility features coming in the next year – plus a new Accessibility shortcut called Hold That Thought:

New features include Accessibility Nutrition Labels on the App Store, Magnifier for Mac, Braille Access, and Accessibility Reader; plus innovative updates to Live Listen, visionOS, Personal Voice, and more.

Near the end of the release, Apple explains their new shortcut, plus the addition of the existing Accessibility Assistant shortcut to Vision Pro:

The Shortcuts app adds Hold That Thought, a shortcut that prompts users to capture and recall information in a note so interruptions don’t derail their flow. The Accessibility Assistant shortcut has been added to Shortcuts on Apple Vision Pro to help recommend accessibility features based on user preferences.

Here’s how Apple describes the shortcut:

Interruptions can cause you to forget tasks and affect productivity, especially for neurodivergent individuals.

When you run this shortcut, you have two options: Capture and Recall.

Run the shortcut and select Capture to capture a screenshot of what you’re doing, any calendar events in the next hour, current open webpage in Safari (Mac only), and Clipboard contents. You’ll then be prompted to write short notes about what you are doing and what you are about to do. Run the shortcut again and select Recall to find the last created note with all the captured information. All notes will be saved with the title “Hold that thought” and the date and time saved.

Run this shortcut using Siri, or add it to the Control Center, Action button or to the Home Screen for quick access.

I love this idea, and the core concept matches the inspiration for my currently-secret app idea that I teased at the end of my Deep Dish Swift talk.

I do have a few suggestions for improvements to the shortcut, however:

  • Remove the errant space in the Choose From Menu prompt between “Capture” and “or” – it currently reads “Capture  or recall last stopping point?”
  • For both the emoji-prefixed “Capture” and “Recall” options in Choose From Menu, Apple should add Synonyms for “Capture” and “Recall” – the emoji can cause issues when dictating to Siri (in general, I avoid emoji in Menus for this reason).
  • Utilize the “Find Tabs” action on iOS instead of simply not adding any Safari functionality on mobile. Apple’s use of only “Get Current Safari Tab” for Mac reminds me that they still haven’t brought the Safari actions added to iOS back in 2022 over to macOS, and their absence in this shortcut furthers my belief that these highly sought actions are deprioritized simply because the team doesn’t use iOS as often and the Mac action is “good enough”.
  • The second “Recall” option just opens the note, but I’d rather see the last item I saved – Apple should have gone further to isolate the recent item and display the recalled information, not just open it again. I tried to Recall from my Apple Watch, and the shortcut simply failed.
  • The flow of an alert, a 5-second countdown before a screenshot, and two prompts might be too long for most neurodivergent people to capture information effectively while in the process of being interrupted.

To improve the shortcut as it is today, I’d simply remove the Show Alert and Wait actions, and assign this new shortcut to the Action button – that way you can immediately take a screenshot, then answer the prompts, and move on.

Going further, I’d love to see a new version of this next year once Apple Intelligence ships in full, which utilizes “Get On-Screen Content” and accesses all the data available from apps for Personal Context.

Get “Hold That Thought” for Shortcuts, view the announcement from the Apple Newsroom, and check out past updates from GAAD.

Categories
Announcements Developer Links News

Apple Is Delaying the ‘More Personalized Siri’ Apple Intelligence Features »

On Wednesday, March 5, I published the blog post “New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4”. Today, Friday, March 7, Apple said “Nope” to John Gruber – here’s the quote from Apple spokesperson Jacqueline Roy, from his story on Daring Fireball:

“Siri helps our users find what they need and get things done quickly, and in just the past six months, we’ve made Siri more conversational, introduced new features like type to Siri and product knowledge, and added an integration with ChatGPT. We’ve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps. It’s going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year.”

Gruber also gives his analysis of the situation, which you should read in full.

Oh, and those API pages? Gone.

View the whole story on Daring Fireball.

Categories
Developer News

New developer APIs hint at ‘Personal Context’ for Apple Intelligence coming in iOS 18.4

The first look at Personal Context for Apple Intelligence is here: APIs available in the iOS 18.4 developer betas allow apps to surface their content for the system to understand. This sets the stage for the most significant update to Siri so far, where all your apps can provide Siri with their available views and content to work with – in a secure and private manner, too.

As first mentioned by Prathamesh Kowarkar on Mastodon, there is now a suite of APIs in beta that associate an app’s unique content, called an entity, with a specific view – this allows Siri to read what’s indexed on-screen and use it with other apps’ actions when triggered by a command.

APIs like this are necessary for the coming Siri update to actually do what Apple says Apple Intelligence is capable of – now that the functionality is here, however, it’s up to developers to implement everything to make sure the experience works well.
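
Based on Kowarkar’s findings, the shape of the API is: your view’s NSUserActivity gets tagged with the identifier of the App Entity it’s displaying. A sketch under that reading – the note entity and activity type are made up, and `appEntityIdentifier` is the new beta API, so treat the exact spelling as provisional:

```swift
import AppIntents
import SwiftUI

// Hypothetical note entity the app already exposes via App Intents.
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

struct NoteQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [NoteEntity] {
        [] // Fetch notes by ID from your own store
    }
}

// The new part: tagging the view's user activity with the entity's
// identifier so Siri can tie what's on screen back to that entity.
struct NoteView: View {
    let note: NoteEntity

    var body: some View {
        Text(note.title)
            .userActivity("com.example.viewing-note") { activity in
                activity.title = note.title
                activity.appEntityIdentifier = EntityIdentifier(
                    for: NoteEntity.self,
                    identifier: note.id
                )
            }
    }
}
```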

Here are the new pages:

If these APIs are in beta now, it stands to reason they’ll exit beta when iOS 18.4 ships in full – which means Personal Context might arrive as early as iOS 18.4.

Check out the post from Kowarkar on Mastodon.

Update: Nope – it’s officially delayed.

Categories
Gear Links News

Apple unveils new Mac Studio, the most powerful Mac ever »

From the Apple Newsroom:

Apple today announced the new Mac Studio, the most powerful Mac ever made, featuring M4 Max and the new M3 Ultra chip. The ultimate pro desktop delivers groundbreaking pro performance, extensive connectivity now with Thunderbolt 5, and new capabilities in its compact and quiet design that can live right on a desk. Mac Studio can tackle the most intense workloads with its powerful CPU, Apple’s advanced graphics architecture, higher unified memory capacity, ultrafast SSD storage, and a faster and more efficient Neural Engine.

My M1 Mac mini from 2020 is also way overdue for an upgrade…

View the original.