Similar Items Schema for Google Image Search on Mobile Web and Android Google App Launched

The new "similar items" search feature on the mobile web and the Android Google Search app rolls out across the globe.


Image Search results now show "similar items" on the mobile web and in the Android Search app. When you view "lifestyle" images and tap one that you like, "using machine vision technology, the feature identifies the products in the images and displays additional matching products."

As of today, "similar items" supports products including "handbags, sunglasses, and shoes," and in the coming months it will expand to cover other apparel and home & garden categories as well, says Google.

The Similar items carousel gets millions of impressions and clicks daily from all over the world. To try it out, just search Google with queries like [designer handbags].

Site owners who want their products to be eligible for Similar items need to add schema.org markup to their pages. Specifically, you must:

  • Ensure that the product offerings on your pages have schema.org product markup, including an image reference. Products with name, image, price & currency, and availability metadata on their host page are eligible for Similar items.
  • Test your pages with Google's Structured Data Testing Tool to verify that the product markup is formatted correctly
  • See your images on Image Search by issuing the query "site:yourdomain.com." For results with valid product markup, you may see product information appear once you tap on the images from your site. It can take up to a week for Googlebot to recrawl your website, writes Google.
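To make the required fields concrete, here is a sketch of the schema.org/Product data a product page might expose — the product name, image URL, and price below are invented for illustration, and the JSON-LD produced would normally sit inside a `<script type="application/ld+json">` tag in the page's HTML:

```python
import json

# Hypothetical schema.org/Product object covering the fields Google lists
# as required for Similar items: name, image, price & currency, availability.
# All values are made up for illustration.
product_markup = {
    "@context": "https://schema.org/",
    "@type": "Product",
    "name": "Leather Tote Handbag",                   # product name
    "image": "https://example.com/images/tote.jpg",   # image reference
    "offers": {
        "@type": "Offer",
        "price": "149.99",                            # price
        "priceCurrency": "USD",                       # currency
        "availability": "https://schema.org/InStock", # availability
    },
}

# Serialize to JSON-LD; on a real page this string would be embedded in a
# <script type="application/ld+json"> tag and then checked with Google's
# Structured Data Testing Tool.
json_ld = json.dumps(product_markup, indent=2)
print(json_ld)
```

A page carrying all four pieces of metadata in this shape satisfies the eligibility checklist above.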

"The schema.org/Product markup helps Google find product offerings on the web and give users an at-a-glance summary of product info."

Here is a screenshot of the Similar items search results:

Similar items on Google Image Search

Google is currently testing a new way to train its artificial intelligence algorithms using Android phones, dubbed "Federated Learning."

More specifically, it's currently in use in Gboard on Android, which uses Gboard data stored on the device to offer more personalized query suggestions.

When Gboard shows a suggested query, the phone locally stores information about the suggestions and which ones were clicked. Federated Learning then processes this data on-device to improve Gboard's query suggestions on each individual's device. Google then uses a "Secure Aggregation protocol" to collect these personalized changes and aggregate them into a single new update to the app for all users.
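The flow above can be sketched as simple federated averaging: each device trains a local copy of the model, and only the aggregated result reaches the server. The toy weights and two-device setup below are invented, and the real Secure Aggregation protocol additionally combines updates cryptographically so the server never sees any single device's contribution:

```python
# Minimal federated-averaging sketch. Each device computes a model update
# locally; the server only applies the average of the deltas.
# "Training" here is a toy nudge toward the device's data, for illustration.

def local_update(weights, device_data, lr=0.1):
    """Fake on-device training step: move weights toward local data."""
    return [w + lr * (d - w) for w, d in zip(weights, device_data)]

def aggregate(global_weights, device_updates):
    """Server-side step: average per-device deltas into one global update.
    (Real Secure Aggregation would sum encrypted updates so individual
    contributions stay hidden from the server.)"""
    n = len(device_updates)
    deltas = [
        [u - g for u, g in zip(update, global_weights)]
        for update in device_updates
    ]
    avg_delta = [sum(col) / n for col in zip(*deltas)]
    return [g + d for g, d in zip(global_weights, avg_delta)]

# Two hypothetical phones, each with different local data that never
# leaves the device.
global_model = [0.0, 0.0]
phone_a = local_update(global_model, [1.0, 0.0])  # stays on device A
phone_b = local_update(global_model, [0.0, 1.0])  # stays on device B

# Only the aggregated update changes the shared model for all users.
global_model = aggregate(global_model, [phone_a, phone_b])
print(global_model)
```

The key property is that `aggregate` only ever needs the combined deltas, which is what makes the secure-aggregation step possible.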

Beyond Gboard query suggestions, Google says it is also improving the "language models" that power Gboard "based on what you actually type on your phone (which can have a style all its own) and photo rankings based on what kinds of photos people look at, share, or delete."

Federated Learning on Gboard