
The Google Lens app is now available on Google Photos for iOS


As announced via the official Google Photos Twitter account on March 15, users of Google Photos for iOS can now use Google Lens to analyze and extract text, hyperlinks, and other information from photos.

Originally announced at Google’s I/O event, Google Lens uses machine learning to extract text and hyperlinks from images, identify landmarks from around the world, and perform a host of other promised tricks. It first launched on Google’s Pixel phones at the tail end of 2017, before rolling out to all Android phones in March 2018. As of today, March 16, iOS users can also tap into Google Lens through the iOS Google Photos app.


Starting today and rolling out over the next week, those of you on iOS can try the preview of Google Lens to quickly take action from a photo or discover more about the world around you. Make sure you have the latest version (3.15) of the app. https://t.co/Ni6MwEh1bu pic.twitter.com/UyIkwAP3i9

— Google Photos (@googlephotos) March 15, 2018

Anyone looking to play with Google Lens should first make sure the Google Photos app is updated to the latest version (3.15). Then open the app, open a photo, and tap the Google Lens logo. If you’re struggling to find it, Google has posted a small guide on its support website. Some Twitter users have been complaining that they have not yet been able to access the functionality, and it seems the update is still in the process of rolling out worldwide. It’s also worth noting that, for the time being, Google Lens can only be used if your iOS device’s language is set to English.

But what can you do with Google Lens? It can extract text from your Google Photos, and while that may not sound impressive, it can then use that text to find businesses and addresses, pull out hyperlinks, or identify books, movies, and games. If you take a picture of a business card, Google Lens will offer to save the information as a new contact, taking some of the fuss out of business networking. Landmarks can also be identified, with information on ratings, tours, and history offered as a result.

Use Google Lens to copy and take action on text you see. Visit a website, get directions, add an event to your calendar, call a number, copy and paste a recipe, and more. pic.twitter.com/E4ww2cxVUd

— Google Photos (@googlephotos) March 15, 2018

The Google Photos account has been sharing more than a few ways to make Google Lens work for you. While the fact that it’s currently restricted to the Google Photos app on iOS makes it a bit harder to use in everyday circumstances, it’s a really cool addition and a great indication of what the future has in store.

Mark Jansen
Mobile Evergreen Editor