
August 9, 2018

Google Search Console Releases URL Inspection Tool & More Exciting Features

Google’s new Search Console, currently in beta, recently received several new features and updates that make it easier for webmasters to get work done in real time. These include an Indexing API for job posting URLs, a URL Inspection tool, improved accuracy in the Index Coverage report, and a referrer URL specific to images, all now available to users around the world.

On June 25, Google released a new URL Inspection tool that shows details about how Google crawls, indexes, and processes information about pages, pulled directly from its search index.

One of the most requested features, the new tool details how Google Search sees a specific URL once it is entered: it shows the date and status of the last crawl, along with any crawling or indexing errors and the canonical URL for that page.

Additional information includes whether the page was successfully indexed, any AMP errors, structured data errors, and indexing issues, as well as enhancements found on the page, such as a linked AMP version or rich results like Recipes and Jobs, says Google.

Here is a screenshot showing a URL that is indexed with a valid AMP enhancement:

Google URL Inspection tool: URL indexed with AMP success report

If a page isn’t indexed, the new report will include information about the “noindex” robots meta tag and the canonical URL for the page. In the screenshot below, you can see a URL that is not indexed due to a “noindex” meta tag in the HTML:

Google Search Console URL Inspection tool: URL not indexed error report

Google notes that, though the new Search Console is still in beta, the new issue validation flow in the Index Coverage and AMP reports is already helping webmasters fix issues more quickly.

Alongside the URL Inspection tool, a few more features and reports were launched in the new Search Console, including:

  • A Recipe report that helps fix structured data issues affecting recipe rich results. Using its task-oriented interface, you can test and validate fixes.
  • The Performance report, which now gives more visibility with new Search Appearance filters in Search Analytics, including Web Light and Google Play Instant results.

In addition, changes to the Google Search Analytics API now let developers request up to 25,000 rows per request, and the API has begun returning 16 months of traffic data, just like the Performance report.

“Are you using the Search Console Search Analytics API? We recently increased the max results/request to 25k and put together a guide for getting all your site’s data,” Google tweeted. A new developer guide for getting data out of Google Search Console is also available.
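For developers working with the API, here is a minimal sketch of what a query against the new limits might look like; the google-api-python-client package, pre-authorized OAuth credentials, and the dates and site URL are all assumptions:

```python
# A minimal sketch of a Search Analytics API query using the raised
# 25,000-row limit and the extended 16-month window. Assumes
# `credentials` already holds authorized OAuth credentials for the
# Search Console (webmasters) scope; dates and site URL are placeholders.
from googleapiclient.discovery import build

def fetch_search_analytics(credentials, site_url):
    service = build("webmasters", "v3", credentials=credentials)
    body = {
        "startDate": "2017-04-01",  # the window now reaches back 16 months
        "endDate": "2018-08-01",
        "dimensions": ["query", "page"],
        "rowLimit": 25000,          # new per-request maximum
        "startRow": 0,              # page through results in 25k chunks
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=body).execute()
    return response.get("rows", [])
```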

On July 25, Google removed the public version of its URL submission tool, which allowed users to submit any webpage to its search index.

Instead of this tool, Google recommends using the “Fetch & Submit” functionality available through Search Console. Alternatively, website owners can notify Google about new pages via a sitemap file. Right after the announcement, the “addurl” page began redirecting to the Google Search Console login page.
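For the sitemap route, here is a rough sketch of the ping mechanism Google accepted at the time, which simply takes the sitemap location as a query parameter; the requests package and the example URL are assumptions:

```python
# Notify Google that a sitemap has been added or updated by issuing a
# simple HTTP GET against the ping endpoint available at the time.
import requests
from urllib.parse import quote

def ping_google_sitemap(sitemap_url):
    endpoint = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
    response = requests.get(endpoint, timeout=10)
    response.raise_for_status()  # a 200 response means the ping was received

ping_google_sitemap("https://www.example.com/sitemap.xml")
```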

Using the Fetch & Submit tool, site owners can submit up to 10 individual URLs per day to the search index. From Google Search Console, perform the steps below:

  • Perform a fetch (or fetch and render) request for a URL using Fetch as Google.
  • Click “Request indexing” next to the fetch in the fetch history table.
  • Select whether to crawl only that single URL or that URL plus its direct links.
  • Click “Submit” to queue your request.

Google Search Console Fetch & Submit Tool

Google Webmasters said, “We’ve had to drop the public submission feature.” The company did not say why it was necessary, though it may have to do with spam or abuse. Google added, “… but we continue to welcome your submissions using the usual tool in Search Console and through sitemaps directly.”

Additionally, the URL submission form that appeared in the SERPs when searching for “submit URL to Google” is gone.

On July 27, Google announced an upcoming update to the referral source URLs for Google Images, meant to help site owners better understand the extent to which Google Images drives traffic to their sites.

“Every day, hundreds of millions of people use Google Images to visually discover and explore content on the web.[…]For webmasters, it hasn’t always been easy to understand the role Google Images plays in driving site traffic. To address this, we will roll out a new referer URL specific to Google Images over the next few months.”

The new Google Image referer URL, which is part of the HTTP header, will indicate the last page a user was on when they clicked to visit the destination web page.

The change will happen automatically; site owners don’t need to do anything. However, Google emphasizes that developers, particularly those who create software to track and analyze website traffic, should ingest the new referer URL https://images.google.com and attribute the traffic to Google Images.

Also, Google Analytics will automatically ingest the new referral URL and appropriately attribute traffic to Google Images. There will be no effect on how data is captured in Search Console, and webmasters will continue to receive an aggregate list of the top search queries that drive traffic to their site.

Lastly, the new referer URL will continue to have the same country code top-level domain (ccTLD) as the URL used for searching on Google Images, says Google. This means most visitors worldwide will come from images.google.com, thanks to last year’s change that made google.com the default for searchers worldwide. However, “some users may still choose to go directly to a country-specific service, such as google.co.uk for the UK. For this use case, the referer uses that country TLD (for example, images.google.co.uk),” explained Google.
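For developers writing their own tracking code, here is a hedged sketch of applying the attribution rule described above to an incoming Referer header; the function name and labels are purely illustrative:

```python
# Classify a hit from its Referer header, treating images.google.com
# and country-specific hosts (e.g. images.google.co.uk) as Google Images.
from urllib.parse import urlparse

def traffic_source(referer_header):
    host = urlparse(referer_header or "").hostname or ""
    if host == "images.google.com" or host.startswith("images.google."):
        return "google-images"
    if host == "www.google.com" or host.startswith("www.google."):
        return "google-search"
    return "other"

assert traffic_source("https://images.google.com/") == "google-images"
assert traffic_source("https://images.google.co.uk/") == "google-images"
```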

The Search Console Index Coverage report recently finished switching over to a new, more accurate system on August 1.

The switchover, which began on July 14, upgraded the Index Coverage report to a “new, more accurate system,” and any resulting data shifts do not reflect changes to users’ sites, said Google. As a result, Google was unable to record Index Coverage data from July 14 to August 1; data for that period “was estimated from the known August 1 values,” Google wrote.

On July 26, Google introduced an Indexing API for job posting URLs, enabling timely indexing of job content to improve the job search experience for job seekers around the world with relevant opportunities from third-party providers across the web.

Timely indexing of new job content is critical because many jobs are filled relatively quickly. Removal of expired postings is important because nothing’s worse than finding a great job only to discover it’s no longer accepting applications.

“Today we’re releasing the Indexing API to address this problem,” the company said.

Using the new API, a site owner can directly notify Google when job posting pages are added or removed. Google said this results in job postings being scheduled for a “fresh crawl, which can lead to higher quality user traffic and job applicant satisfaction.”

The Indexing API currently supports only job posting pages that include job posting structured data.
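To give a sense of what that structured data looks like, here is an illustrative sketch of a schema.org JobPosting object built in Python and serialized to the JSON-LD a page would embed; the field values are placeholders, not a complete schema:

```python
# Sketch of schema.org JobPosting structured data as JSON-LD; fields
# and values are illustrative examples only.
import json

job_posting_jsonld = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Software Engineer",       # placeholder values throughout
    "datePosted": "2018-07-26",
    "validThrough": "2018-09-26T00:00",
    "hiringOrganization": {"@type": "Organization", "name": "Example Co"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "London"},
    },
}

# Embed in the page as: <script type="application/ld+json">…</script>
print(json.dumps(job_posting_jsonld, indent=2))
```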

To see how the Indexing API works, Google published a quickstart guide targeted at developers and webmasters.
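As a companion to that guide, here is a minimal sketch of publishing a notification with a service account; google-auth and requests are assumed, and the key file path and job URL are placeholders:

```python
# Publish an add/update or removal notification to the Indexing API.
import json
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    "service_account.json", scopes=SCOPES)  # placeholder key file
session = AuthorizedSession(credentials)

def notify(url, notification_type):
    # URL_UPDATED for new or changed postings, URL_DELETED for expired ones.
    response = session.post(ENDPOINT, data=json.dumps(
        {"url": url, "type": notification_type}))
    response.raise_for_status()
    return response.json()

notify("https://www.example.com/jobs/1234", "URL_UPDATED")
```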

On July 9, the previously announced “Speed Update” finished rolling out: beginning in July 2018, “page speed will be a ranking factor for mobile searches.” The update, announced six months before it reached all users, affects mobile search rankings. “It will only impact pages that deliver the slowest experience to users and will only affect a ‘small percentage of queries’,” wrote Google.

“It applies the same standard to all pages, regardless of the technology used to build the page. The intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content,” added Google.

This update only impacts the slowest sites on the internet, though there is no specific metric or tool that directly indicates whether a page is affected by the new ranking factor. Google uses a number of ways to measure page speed and suggested webmasters look at metrics from the Chrome User Experience Report, the Lighthouse tool, and the PageSpeed Insights tool.
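For measuring, here is a sketch of pulling a page’s speed score from the PageSpeed Insights API; v4 was the current version at the time, the API key is a placeholder, and the response path used for the score is a best-effort reading of the v4 format:

```python
# Fetch the 0-100 speed score for a URL from PageSpeed Insights v4.
import requests

API = "https://www.googleapis.com/pagespeedonline/v4/runPagespeed"

def page_speed(url, api_key, strategy="mobile"):
    params = {"url": url, "strategy": strategy, "key": api_key}
    response = requests.get(API, params=params, timeout=30)
    response.raise_for_status()
    result = response.json()
    # ruleGroups.SPEED.score carries the overall speed score in v4.
    return result.get("ruleGroups", {}).get("SPEED", {}).get("score")

print(page_speed("https://www.example.com/", "YOUR_API_KEY"))
```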

In a Webmaster Central hangout, John Mueller warned that site owners using “meta refresh” as a redirect may end up with the wrong content getting indexed.

He was responding to the question: “Some sites are using meta refresh after 5 seconds and redirecting the user to a payment page from the content. In this case, does it impact their ranking? I still see their pages indexed with content behind payment, and Google users can’t see the content. What is Google’s recommendation here?”

Mueller said Google will assume the page being refreshed to is the one that should be indexed.
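To make the distinction concrete, here is a small illustrative sketch of the server-side alternative: instead of serving content with a delayed meta refresh, an HTTP 301 leaves no ambiguity about which URL should be indexed. Flask and the routes are assumptions for illustration:

```python
# Instead of a delayed client-side refresh such as
#   <meta http-equiv="refresh" content="5;url=/payment">
# a server-side redirect tells crawlers exactly which page to index.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-article")
def old_article():
    # A permanent redirect consolidates indexing on the destination URL.
    return redirect("/payment", code=301)

if __name__ == "__main__":
    app.run()
```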

Mueller also recently asked SEOs for feedback in a tweet about “whether site owners should fear a loss in traffic when changing domains.” More specifically, he asked: “If you’ve done a domain change, how did it go? Are the fears well-founded? Does it help to plan and follow through systematically?”

Update 08/16: Google has ported three features from the classic Webmaster Tools to the new Search Console, including:

  • Managing users and permissions
  • Adding sites and validating ownership
  • Mobile Usability report

 
