Google Lens, Google's machine learning-powered image recognition service, can be found in two places. The first is an inconspicuous button inside the Google Photos app, visible when you open an image, that helps identify things like barcodes, buildings, and books. Beyond simple image recognition, it can also take action on photos: adding a contact to your phone from a photograph of a business card, for example, or translating pictures of text written in another language.
This feature, once exclusive to Google's Pixel smartphone lineup, is currently rolling out widely to Android handsets running the latest version of Google Photos, though the company says it's not quite ready for release in the iOS version of the app.
The second place Lens can be found is inside Google Assistant, where it works a little differently. Once you launch Assistant and tap the Lens icon in the bottom-right corner, you can perform all the same tasks as in the Photos app, but by tapping objects in the live viewfinder rather than using photographs you've already taken.
Google said this implementation of Lens would come to "compatible flagship devices", and it appears the company has made good on that announcement: if you own a Samsung Galaxy S8, S8+, S9, S9+, or Note8, you may start seeing the Google Lens button inside Assistant over the coming days. If you don't, you can try installing the latest beta version of the Google app, but since Google favors gradual rollouts of its new features and services, the button will likely appear on its own sooner rather than later.