May 9, 2018

Google Lens Adds Real-time Answers, Google Photos Suggested Actions

Google Lens, which was integrated into Google Photos and the Google Assistant last year, helps people get answers to questions that are otherwise difficult to describe in a search box, like "what type of dog is that?" Today it gained new capabilities for answering questions about the world around you.

As unveiled at I/O, Lens will now be available directly in the camera app, along with three updates that enable Lens to answer more questions, about more things, more quickly. The Lens and camera app integration will work on devices from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus, and of course Google Pixel.

Here is what’s new in Lens:

Google Lens can now answer questions with Style Match: just open Lens and point it at an outfit or a home decor item, and Lens will promptly provide information on that specific item, including reviews, and also offer similar styles that fit the look you like.

The GIF below shows Lens Style Match for clothing on a phone:

Google Lens Style Match
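
Google hasn't published how Style Match works under the hood, but a plausible building block is comparing a visual embedding of the photographed item against a catalog of item embeddings. The Kotlin sketch below illustrates that retrieval idea only; CatalogItem, the embedding vectors, and the similarity search are hypothetical stand-ins, not Lens code.

```kotlin
// Illustrative sketch: Google has not published Style Match internals.
// Assume each catalog item already has a visual embedding (a float vector)
// produced by some image model; CatalogItem and the vectors are hypothetical.
data class CatalogItem(val name: String, val embedding: FloatArray)

fun cosineSimilarity(a: FloatArray, b: FloatArray): Float {
    var dot = 0f; var normA = 0f; var normB = 0f
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (kotlin.math.sqrt(normA) * kotlin.math.sqrt(normB))
}

// Return the k catalog items whose embeddings are closest to the query,
// i.e. the "similar styles" a Lens-like feature would surface.
fun similarStyles(query: FloatArray, catalog: List<CatalogItem>, k: Int = 5): List<CatalogItem> =
    catalog.sortedByDescending { cosineSimilarity(query, it.embedding) }.take(k)
```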

Lens now leverages state-of-the-art machine learning (ML) to proactively surface information in real time. You just point the camera at an item and can instantly start browsing information about it. Lens combines ML on-device intelligence with Cloud TPUs to identify billions of words, phrases, places, and things in a split second.

The animation below shows Lens identifying elements in real time:

Google Lens Real-time Item Identification
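
Lens' actual pipeline is proprietary, but the same on-device identification pattern is available to Android developers through Google's ML Kit image-labeling API. The hedged sketch below, which assumes the standalone ML Kit image-labeling library is on the classpath, labels a single camera frame on-device:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Sketch: label one camera frame on-device with ML Kit's default model.
// This is not Lens' pipeline, just the same style of on-device inference.
fun labelFrame(bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            for (label in labels) {
                println("${label.text}: confidence ${label.confidence}")
            }
        }
        .addOnFailureListener { e -> println("Labeling failed: $e") }
}
```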

Lens is also now capable of getting you answers through a new "smart text selection" feature, which connects the words you see with the answers and actions you need. You can copy and paste text from the real world, like recipes, gift card codes, or Wi-Fi passwords, to your phone, and Lens helps you make sense of a page of words by showing relevant information and photos.

For example, if you see the name of a dish you don't recognize, Lens can now show you a picture of it when you copy and paste the dish name into Lens. The tech behind this "requires not just recognizing shapes of letters, but also the meaning and context behind the words," explains Google.

The GIF below shows Lens' copy-and-paste feature:

Google Lens Smart Copy and Paste Feature
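
To give a feel for the moving parts, here is a hedged sketch of the recognize-then-copy flow using ML Kit's on-device text recognition API and the standard Android clipboard. It approximates the idea, not Lens' actual implementation:

```kotlin
import android.content.ClipData
import android.content.ClipboardManager
import android.content.Context
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Sketch: recognize text in a photo and copy it to the clipboard,
// approximating the "copy text from the real world" step.
fun copyTextFromPhoto(context: Context, bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, 0)
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recognizer.process(image)
        .addOnSuccessListener { result ->
            val clipboard =
                context.getSystemService(Context.CLIPBOARD_SERVICE) as ClipboardManager
            clipboard.setPrimaryClip(ClipData.newPlainText("lens-text", result.text))
        }
        .addOnFailureListener { e -> println("Text recognition failed: $e") }
}
```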

All of these Google Lens features will start rolling out to users around the world over the next few weeks.

One-tap actions and more places to experience Google Photos

Three years in, Google Photos, the hub designed to organize pictures and videos on mobile devices, today received new functionality powered by machine learning (ML) to help you get even more out of it. For example, you can now take action on photos right as you view them.

Starting today, a range of new suggested actions will appear on photos right as you're viewing them, such as options to brighten, share, rotate, or archive a picture. The Photos app uses ML to suggest these actions only on relevant photos; tapping a suggestion completes the action.

Google adds that people are looking at 5 billion pictures in Google Photos every day.

In the GIF animations below you can see the new suggested actions:

Google Photos Suggested Actions Main

Google Photos Suggested Actions: Send to
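
Google hasn't detailed how Photos decides which suggestion fits which photo; the real system is an ML model. As a purely illustrative stand-in, the Kotlin sketch below suggests "brighten" when a photo's average luminance falls below a threshold:

```kotlin
import android.graphics.Bitmap
import android.graphics.Color

// Illustrative heuristic only: Photos' real suggestion model is unpublished.
// Suggest "brighten" when a photo's average luminance is below a threshold.
fun suggestBrighten(bitmap: Bitmap, threshold: Double = 60.0): Boolean {
    var sum = 0.0
    var count = 0
    // Sample a coarse grid instead of every pixel to keep it fast.
    val step = maxOf(1, bitmap.width / 64)
    for (x in 0 until bitmap.width step step) {
        for (y in 0 until bitmap.height step step) {
            val p = bitmap.getPixel(x, y)
            // Rec. 601 luma from RGB.
            sum += 0.299 * Color.red(p) + 0.587 * Color.green(p) + 0.114 * Color.blue(p)
            count++
        }
    }
    return (sum / count) < threshold
}
```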

Another feature uses artificial intelligence (AI) to show photos in a new light. The Photos app detects the subject of a photo and leaves it, along with their clothing and anything in their hands, in color, while the background is set to black and white. These new AI-powered creations can be seen in the Assistant tab of Google Photos.

Google Photos: Color Subject with Black-and-White Background
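
The effect itself is straightforward once a subject mask exists. The sketch below assumes a segmentation model has already produced such a mask (the hard ML part, which Google hasn't published) and simply desaturates the background pixels:

```kotlin
import android.graphics.Bitmap
import android.graphics.Color

// Sketch of the "color pop" effect: given a subject mask (true = keep in color,
// indexed [x][y] over the bitmap's width and height), convert every background
// pixel to grayscale. The mask is assumed to come from a segmentation model.
fun colorPop(src: Bitmap, subjectMask: Array<BooleanArray>): Bitmap {
    val out = src.copy(Bitmap.Config.ARGB_8888, true)
    for (x in 0 until out.width) {
        for (y in 0 until out.height) {
            if (!subjectMask[x][y]) {
                val p = out.getPixel(x, y)
                val gray = (0.299 * Color.red(p) + 0.587 * Color.green(p) +
                        0.114 * Color.blue(p)).toInt()
                out.setPixel(x, y, Color.argb(Color.alpha(p), gray, gray, gray))
            }
        }
    }
    return out
}
```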

Additionally, Google is also working on the ability to change black-and-white shots into color with just a tap.

Lastly, through a new Photos partner program, Google is giving developers the ability to support Google Photos in their own products, so people can choose to access their photos whenever they need them, say through a connected photo frame.

A new Android app, "Lookout," coming to the Play Store in the U.S. this year, aims to give people who are blind or visually impaired auditory cues as they encounter objects, text, and people around them.

Google suggests wearing the Pixel device on a lanyard around the neck, or in a shirt pocket, with the camera pointing away from the body.

You open the Lookout app and select a mode; the app then processes items of importance and delivers information relevant to the selected activity through spoken notifications, designed to be used with minimal interaction.

The app currently offers four modes to choose from: Home, Work & Play, Scan, and Experimental (for trying out experimental features).

Google explains, “If you’re getting ready to do your daily chores you’d select “Home” and you’ll hear notifications that tell you where the couch, table or dishwasher is. It gives you an idea of where those objects are in relation to you, for example “couch 3 o’clock” means the couch is on your right. If you select “Work & Play” when heading into the office, it may tell you when you’re next to an elevator, or stairwell.”
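
The clock-direction phrasing is easy to reproduce. The sketch below maps a horizontal bearing from an object detector (a hypothetical input; Lookout's detector isn't public) to a clock position and speaks it with Android's TextToSpeech, so a couch at a bearing of 90 degrees would be announced as "couch 3 o'clock":

```kotlin
import android.speech.tts.TextToSpeech

// Sketch of Lookout-style spoken cues. Bearing is in degrees, where 0 means
// straight ahead and positive values turn clockwise (to the user's right).
fun clockPosition(bearingDegrees: Float): Int {
    // Normalize to [0, 360); each clock hour covers 30 degrees; 0 = 12 o'clock.
    val normalized = ((bearingDegrees % 360) + 360) % 360
    val hour = Math.round(normalized / 30f) % 12
    return if (hour == 0) 12 else hour
}

// Speak "<object> <hour> o'clock" through an already-initialized TextToSpeech.
fun announce(tts: TextToSpeech, objectName: String, bearingDegrees: Float) {
    val phrase = "$objectName ${clockPosition(bearingDegrees)} o'clock"
    tts.speak(phrase, TextToSpeech.QUEUE_ADD, null, "lookout-cue")
}
```

For instance, announce(tts, "couch", 90f) speaks "couch 3 o'clock", matching the example in Google's description.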

Google adds that because the app uses machine learning to learn what people are interested in hearing about, it will deliver better results as more people use it.

Watch this video to see the Lookout app in action:
