Google Lens

Announced at last year’s I/O, Google Lens is gaining a number of new capabilities during this year’s keynote. The visual search feature is getting a new look and being integrated with third-party camera apps from OEMs. Meanwhile, it is adding three new features, the most notable being real-time answers.
LG officially unveiled its latest flagship smartphone today with the LG G7 ThinQ. The phone offers quite a few new features, but one of its most interesting aspects is how it integrates Google Assistant. Hidden in LG’s press release, though, is an interesting tidbit about Google Lens – we’re getting new features at I/O.
At Mobile World Congress 2018, Google announced a wider rollout for Lens in both Assistant and Photos, and new features like improved support for recognizing animals. This ability to recognize different cat and dog breeds is now going live for users.
With the first Routines launching yesterday, Assistant is now adding another feature that Google announced last month. Meanwhile, text selection in Google Lens for Assistant is now widely available.
Google’s AI is extremely powerful, and there’s no better showcase for that than the Google Assistant. Its most recent trick, Google Lens, debuted shortly after the Pixel 2’s launch, and now it’s finally rolling out to all users. Sort of…
In addition to a bevy of Assistant news ahead of MWC 2018, Google Lens is also gaining a wider release and new features. Meanwhile, per a report yesterday, Google is also bringing ARCore out of beta today and touts 100 million supported devices.
Due to the holiday season, version 7.17 of the Google app first hit the beta channel in late November and rolled out to all users throughout December. Now, version 7.18 has arrived with more hints of the rumored Home with a display, Lens functionality, and more.
Google Lens launched in beta on the company’s Pixel smartphones, so understandably, it’s a bit light on features. However, Google is continually working to improve it. Recently, one of the leads on the Lens project revealed some of what we should expect in the coming months…
Google Lens rolled out to Assistant on Pixel and Pixel 2 devices earlier this week and just today gained a translation feature. Meanwhile, Lens in Google Photos was also updated today with a text selection feature, while the app also gained a light navigation bar.
Following last week’s announcement, Google Lens began widely rolling out on Monday to Assistant on Pixel and Pixel 2 phones. In a teardown of the Google app yesterday, we spotted that Lens was working on adding the previously announced translation feature, among several others. Today, that functionality is starting to roll out to Google Lens.
Last Tuesday, Google announced that Lens would be widely rolling out to Assistant on Pixel and Pixel 2 devices over the coming weeks. This Monday morning, several reports note that the visual search feature is now more widely available.
Over the weekend, several Pixel and Pixel 2 owners noticed that Google Lens was now available on their devices. Today, Google officially announced the rollout of the visual search feature and that it would be taking place over the coming weeks.
Back in October, several Googlers noted that Google Lens would be coming to Assistant “in the next few weeks.” On Friday evening, the first users spotted the visual search feature up and running on their Pixel and Pixel 2 phones.
Following a redesign of the share sheet in the previous update, the latest version of Google Photos is rolling out now. Google Lens on the Pixel and Pixel 2 picks up a really neat and Googley animation when activating, while there are minor UI changes and a photo book-related promotion.
Google Lens has a ton of potential, and the bits Google showed off with it at I/O this May and GDD in September are incredible. However, basically none of that is live right now, and functionality is incredibly limited within the Photos app. Over time it’s going to improve, though, and a couple of Googlers have hit Twitter to give us more information.
At its October 4th event, Google shared more details about Lens including how it would initially launch in Google Photos for Pixel devices. After launching with the Pixel 2 and Pixel 2 XL, Google Lens is beginning to appear for those on last year’s devices as part of a “Pixel preview.”
Much like how Google Assistant was announced at I/O 16 and initially premiered with the Pixel last year, Google Lens is seeing a similar release schedule. This “set of vision based computing capabilities” for performing tasks like visual search is launching first on the Pixel 2 and Pixel 2 XL.
Since I/O 2017, Google has been working on adding Google Lens to Assistant. With version 7.12 of the Google app, we’ve been able to activate Lens and demonstrate what it looks like. Additionally, we were able to initiate the new male Assistant voice that we spotted last week.
Thanks to the last version of the Google app, we learned a great deal about what ‘Bisto’ is, what it does, and even how it works. The latest version of the app began rolling out yesterday and it reveals a number of significant things, including insight into the next version of Android.
Google Lens was one of the most exciting announcements of I/O 2017, but we unfortunately have to wait until later this year for it to go live. However, signs of it are beginning to show up in the latest Google app beta, along with app shortcuts and other minor changes.
At I/O 2017, Sundar Pichai announced Google Lens, a set of vision-based computing capabilities that can understand what you’re looking at and provide actions to interact with the world around you. It will first launch on Google Assistant and Photos…