Apple Vision Pro Testing: Streaming from Mac

Using the developer strap, I figured out how to capture footage from the Apple Vision Pro in real-time – here's my full livestream, over 1.5 hours long.

Last week, in a special livestream for members, I tested using the Apple Vision Pro developer strap to capture my perspective in real-time and share what it’s like to use spatial computing:

In this video, I’m testing how to capture, record, and stream using the Apple Vision Pro and Ecamm Live.

This method requires an Apple Vision Pro with the Apple Developer Strap, a USB-C cable (preferably extra long), QuickTime for Mac, and Ecamm Live.

Become a member to get access to the video, plus:

  • New shortcuts on an ongoing basis
  • Extra ways to browse the catalog when you’re signed in
  • Prerelease notes & workflows I’m putting together

Posts You Might Like

New in the Shortcuts Library: Journal shortcuts
Check out my folder of shortcuts for Apple's Journal app for iPhone, which has been updated with new actions in the iOS 18 betas.
New in the Shortcuts Library: Accessories shortcuts
Get access to the hidden Toggle Accessory shortcut as a member, found in my Accessories folder of shortcuts for your HomeKit devices and scenes.
OpenClaw Showed Me What the Future of Personal AI Assistants Looks Like »
Federico Viticci dove deep into the OpenClaw project from Peter Steinberger and reported back on his experience – a must-read for the state of AI at the beginning of 2026.
PSA: AI Terminal bots can query macOS Spotlight using mdfind
For anyone using Claude Code or OpenClaw, try querying Spotlight with Apple's own Terminal tool `mdfind`.
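As a quick illustration of the kind of queries this enables, here's a sketch of a few common `mdfind` invocations (macOS only; the search paths and terms are just examples, and results depend on Spotlight having indexed your drive):

```shell
# Full-text Spotlight search, scoped to a folder with -onlyin
mdfind -onlyin ~/Documents "Apple Vision Pro"

# Match on file name only, rather than file contents
mdfind -name "Journal"

# Query Spotlight metadata attributes directly
mdfind 'kMDItemDisplayName == "*.md"'
```

Because `mdfind` reads Spotlight's existing index, it's typically much faster than walking the filesystem with `find`, which is what makes it a good fit for Terminal-based AI agents.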