Google is on the verge of launching its latest augmented reality (AR) project. It is not as showy as other AR offerings on the market, but Google appears to be aiming for genuine usefulness: the company focused on everyday utility and assistance to consumers when it decided to embed AR in Google Search.
"We think, with the technologies coming together in augmented reality, in particular, there's this opportunity for Google to be vastly more helpful," says Clay Bavor, vice president of virtual and augmented reality, about the range of AR updates Google has introduced this year.
With Google Search AR, compatible Android and iOS devices will see 3D object links in Search, which will bring up 3D models that can then be dropped into the real world at proper scale in AR. Google Search will incorporate 3D files using the glTF format, as opposed to Apple's USDZ format used by ARKit in iOS 12. According to Google, developers will need to add just a few lines of code to make 3D assets appear in Google Search.
"Anyone who has the 3D assets, and it turns out a lot of the retail partners do, folks like Wayfair or Lowe's, all they have to do is three lines of code," says Aparna Chennapragada, vice president and general manager for camera and AR products. "The content providers don't have to do much else." Google is already working with NASA, New Balance, Samsung, Target, Visible Body, Volvo and Wayfair to incorporate 3D assets into Google Search. The AR effects launch through a new Android feature called Scene Viewer.
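On the device side, handing a 3D model off to Scene Viewer amounts to opening a deep link. Here is a minimal sketch, assuming Google's publicly documented Scene Viewer intent-URL format (the base URL and the `file`, `title`, and `mode` parameters come from that documentation; the model URL below is a hypothetical example asset):

```python
from urllib.parse import urlencode

# Base of the documented Scene Viewer intent URL (assumption: this is the
# publicly documented format; it is not quoted in the article itself).
SCENE_VIEWER_BASE = "https://arvr.google.com/scene-viewer/1.0"

def scene_viewer_url(model_url, title=None, ar_only=False):
    """Build a link that hands a glTF/GLB model off to Scene Viewer on Android."""
    params = {"file": model_url}
    if title:
        params["title"] = title
    if ar_only:
        # "ar_only" skips the 3D preview and drops straight into AR placement.
        params["mode"] = "ar_only"
    return f"{SCENE_VIEWER_BASE}?{urlencode(params)}"

# Hypothetical example asset, just to show the shape of the resulting link.
print(scene_viewer_url("https://example.com/assets/sneaker.glb", title="Sneaker"))
```

Tapping such a link on a compatible Android device opens the model in Scene Viewer, from which the user can place it in their room in AR.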
This means you can bring almost anything to life right in front of you, with a real sense of scale and detail. You can view a dress in 3D to see if it really suits you, or you can bring an animated tiger into your living room.
When you search, if a result is available in 3D you'll get an option right in the Knowledge Panel to view it in 3D and AR.
Beyond Search, Google Lens also received major updates. Using machine learning and computer vision, Lens has evolved to provide more visual answers to visual questions. One example given by Chennapragada: when you're at a restaurant trying to figure out what to order, Lens can automatically highlight recommended dishes right on the menu. Another feature is automatic language detection and translation, with the translation overlaid directly on top of the original words.
These features in Google Search and Google Lens are some of the ways Google is trying to connect helpful digital information to things in the physical world, giving consumers more to go on visually.