A picture is worth a thousand words, but you don’t need to type any of them to search the internet these days. Boosted by artificial intelligence, software on your phone can automatically analyze objects live in your camera view or in a photo (or video) to immediately round up a list of search results. And you don’t even need the latest phone model or third-party apps; current tools for Android and iOS can do the job with a screen tap or swipe. Here’s how.
Circle to Search
Google’s Circle to Search feature, released in January for Android devices, makes firing off a query as easy as dragging your finger around a specific object on the screen, all without switching apps. The feature is available on dozens of Android phones, including many Samsung Galaxy devices. (Galaxy users also have Samsung’s Bixby Vision tool for visual search.)
To use Circle to Search, make sure it is enabled. On Android 15, open the Settings, choose Display & Touch and select Navigation Mode to see the Circle to Search controls. (Steps vary based on the hardware and software involved, but the Settings search box can help you find Circle to Search if you have it.)
Now, when you see something you want to investigate further, summon Circle to Search by pressing and holding the circular home button at the bottom of the screen (in three-button navigation) or the navigation handle, the horizontal bar at the bottom of the screen (in gesture navigation).
When Circle to Search is activated, the screen dims slightly and a menu of search tools appears. Drag your finger around the onscreen item that interests you, and Google Search results will appear below the image. You can add keywords to narrow the search, and A.I. overviews may be included.
Google has not released an iPhone edition of Circle to Search, but there is a workaround that skips the circling and analyzes a freshly made screenshot of the object in question.
To use it, you need the Google app for iOS and a few minutes with Apple’s free Shortcuts app for automating tasks.
Open the Shortcuts app and tap the plus (+) button in the top-right corner. In the Search Actions box, find and select the “Take Screenshot” action. Next, search for the “Search Image With Lens” action and add it, then tap the Done button.
You can ask the Siri assistant to run the shortcut by saying its name, “Search Image With Lens.” But for silent searches, you can also trigger it by touch once you have an image in the camera’s viewfinder or a photo on the screen.
To assign the shortcut to the Action button, available only on the iPhone 15 Pro and Pro Max and later models, go to Settings, choose Action Button, select the Shortcut option, tap Choose a Shortcut and pick Search Image With Lens.
On older iPhones, go to Settings, select Accessibility, then Touch and Back Tap. In the Back Tap menu, select Double Tap or Triple Tap and choose the Search Image With Lens shortcut. Tap the back of your phone two or three times, depending on your setting, to run the shortcut and get your search results.
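For the technically curious, Apple also documents a shortcuts:// URL scheme that runs a shortcut by name, which offers one more way to trigger the screenshot-and-Lens routine built above. Below is a minimal Swift sketch under that assumption; the function name is illustrative, and the shortcut name must match yours exactly.

```swift
import UIKit

// Minimal sketch: launch the "Search Image With Lens" shortcut (built above)
// through Apple's documented shortcuts:// URL scheme. The function name is
// illustrative; the shortcut name must match exactly or nothing will run.
func runLensShortcut() {
    let name = "Search Image With Lens"
    guard let encoded = name.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed),
          let url = URL(string: "shortcuts://run-shortcut?name=\(encoded)") else {
        return
    }
    UIApplication.shared.open(url)   // hands the request off to the Shortcuts app
}
```

Opening that same shortcuts:// address from Safari or a note works, too; the Shortcuts app takes over and runs the named shortcut.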
Looking Through Google Lens
The Google Lens image-recognition technology has been providing visual search results since 2017 and can identify many plants, animals, landmarks, artworks and other objects, as well as stuff (like clothing) you may want to buy. It can also translate signs and assist with math homework, among other things.
The software is available in the stand-alone Google Lens app for Android, as well as in the Google app, Google Photos and the Chrome browser on both Android and iOS.
In the Lens app for Android, tap “Search with your camera” and point the phone at the object you want to search. In other Google apps, tap the square Lens icon onscreen to start searching. The results, which often skew toward products and shopping, appear below the image.
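Lens can also be pointed at an image that is already on the web. As a hedged illustration, the short Swift sketch below builds a lens.google.com “upload by URL” link for a hosted photo; that endpoint is inferred from how the Lens site behaves rather than from official documentation, and the photo address is a placeholder. The finished link can be pasted into any browser.

```swift
import Foundation

// Illustrative sketch: build a Google Lens search link for an image that is
// already online. The uploadbyurl endpoint is an assumption based on how the
// Lens website behaves, not a documented Google API; the photo URL is a placeholder.
func lensSearchURL(for imageAddress: String) -> URL? {
    guard let encoded = imageAddress.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed) else {
        return nil
    }
    return URL(string: "https://lens.google.com/uploadbyurl?url=\(encoded)")
}

print(lensSearchURL(for: "https://example.com/photo.jpg")?.absoluteString ?? "invalid address")
```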
Using iOS Visual Look Up
Apple’s coming A.I.-powered Visual Intelligence feature, which uses the Camera Control button on its iPhone 16 models, will supply real-time visual search before the end of the year. Until then (and for older iPhones), there’s the Visual Look Up tool that arrived in 2021. It works with Apple’s Photos app, Safari browser, Quick Look image previews and elsewhere to identify objects.
To use it, open a photo or pause a video. If Visual Look Up is available, the information button at the bottom of the screen shows a leaf, paw print, map pin or other icon; tap it to get more information about the object in the image from Apple’s Siri assistant. As with all search tools, check the Google and Apple privacy policies if you have concerns.
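App developers can tap the same kind of analysis through Apple’s VisionKit framework on iOS 16 and later. The sketch below, with a placeholder view controller and image name, shows roughly how an app attaches that analysis to an image view so the familiar Look Up overlay appears; treat it as an illustration of the API, not Apple’s recommended pattern.

```swift
import UIKit
import VisionKit

// Minimal sketch (iOS 16+) of VisionKit's ImageAnalyzer, the developer-facing API
// behind Visual Look Up. The class and the "sample" asset name are placeholders.
final class LookUpViewController: UIViewController {
    private let imageView = UIImageView()
    private let analyzer = ImageAnalyzer()
    private let interaction = ImageAnalysisInteraction()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        imageView.contentMode = .scaleAspectFit
        imageView.isUserInteractionEnabled = true
        view.addSubview(imageView)
        imageView.addInteraction(interaction)   // adds the tappable Look Up overlay

        guard let image = UIImage(named: "sample") else { return }   // placeholder asset
        imageView.image = image

        Task {
            // Ask for subject recognition (Visual Look Up) plus live text.
            let configuration = ImageAnalyzer.Configuration([.visualLookUp, .text])
            if let analysis = try? await analyzer.analyze(image, configuration: configuration) {
                interaction.analysis = analysis
                interaction.preferredInteractionTypes = .automatic
            }
        }
    }
}
```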
The accuracy of visual search results can vary, but for those times when you don’t have the words or time to describe what you’re seeing, the tool might point you in the right direction.