
Translate with Google Lens
Opens the Google app to the Lens feature and looks for text to translate using the camera.
Receives a photo from the Share Sheet (or asks you to pick one) and passes it to the Lens feature in the Google app, using it as a search input and showing you results based on what it sees.
Asks you to enter a search query, then passes it to the Google app to show results.
Opens the Google app and activates Voice Search, letting you use Google dictation in place of Apple’s before performing a Google Search.
Opens the Google app and activates the Search function, showing the keyboard immediately so you can start typing your query.
Opens the Google app to the Lens function so you can use your phone’s camera to scan the environment around you and perform AI functions like real-time translation of street signs.
Opens the Magnifier app and activates the Detect Items feature set to detect People in the Camera view.
Uses Siri to speak back the most recently added reminder in the Reminders app.
Opens the Screen Time deep link in Settings on iOS, or the Screen Time preference pane on macOS.
Accesses the Transistor API to retrieve your subscriber data and extract the total subscriber count.
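The count-extraction step of that shortcut can be sketched in Python. This is a minimal sketch, not the shortcut's actual implementation: the endpoint path and the `meta.totalCount` field name are assumptions based on Transistor's paginated API responses, and the sample response below is hypothetical.

```python
import json

def total_subscriber_count(response_body: str) -> int:
    """Extract the total subscriber count from a Transistor-style
    JSON response (e.g. from GET https://api.transistor.fm/v1/subscribers,
    authenticated with an x-api-key header -- endpoint is an assumption).
    The meta.totalCount field name is an assumption based on the API's
    paginated response shape."""
    data = json.loads(response_body)
    return data["meta"]["totalCount"]

# Hypothetical sample response body:
sample = '{"data": [], "meta": {"currentPage": 1, "totalPages": 3, "totalCount": 57}}'
print(total_subscriber_count(sample))  # → 57
```

In the shortcut itself, the equivalent work is done with a Get Contents of URL action followed by Get Dictionary Value actions on the parsed response.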
Opens the mind map I’ve created in MindNode called “Shortcuts”.