Google to Stop Using the AJAX-Crawling Scheme, Will Render #! Pages Itself in Q2 2018

In the second quarter of 2018, Google will switch to rendering AJAX-crawling pages on its own side, rather than requiring sites to do so themselves.


Googlebot AJAX-crawling Diagram

Google will stop using the old AJAX crawling scheme, which was introduced as a way of making JavaScript-based webpages accessible to Googlebot, and will instead render these #! URL pages directly on its own side beginning in the second quarter of 2018. "We'll no longer be using the AJAX crawling scheme," said Google.

With this change in the rendering scheme, website owners will no longer need to provide a pre-rendered version of the page themselves. However, Google will continue to support these URLs in search results.

For those unfamiliar with it, "the AJAX crawling scheme accepts pages with either a "#!" in the URL or a "fragment meta tag" on them, and then crawls them with an "?_escaped_fragment_=" in the URL. That escaped version needs to be a fully-rendered and/or equivalent version of the page, created by the website itself."
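As an illustration of that mapping, the short sketch below derives the `?_escaped_fragment_=` URL a crawler would request for a given #! URL. The helper name and example URL are hypothetical; the transformation itself (replacing "#!" with "?_escaped_fragment_=" and percent-encoding the fragment) follows the scheme described above.

```python
from urllib.parse import quote


def escaped_fragment_url(hashbang_url: str) -> str:
    """Translate a #! URL into the ?_escaped_fragment_= URL a crawler would fetch.

    Hypothetical helper for illustration only; the real scheme also covers
    pages that opt in via the fragment meta tag, which map to an empty
    _escaped_fragment_ value.
    """
    base, sep, fragment = hashbang_url.partition("#!")
    if not sep:
        return hashbang_url  # not an AJAX-crawling URL, leave it unchanged
    # The fragment is percent-encoded and passed as a query parameter.
    joiner = "&" if "?" in base else "?"
    return f"{base}{joiner}_escaped_fragment_={quote(fragment, safe='')}"


# Example with a hypothetical URL:
# escaped_fragment_url("https://example.com/page#!key=value")
# -> "https://example.com/page?_escaped_fragment_=key%3Dvalue"
```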

If your site currently uses either #! URLs or the fragment meta tag, Google said that most "AJAX-crawling websites won't see significant changes with this update." If any issues do come up, a notification will be sent to the affected sites.

Google also recommends that webmasters check their pages as detailed below:

  • Verify ownership of the website in Google Search Console to gain access to the tools there, and to allow Google to notify you of any issues that might be found.
  • Test with Search Console's Fetch & Render. Compare the results of the #! URL and the escaped URL to see any differences. Do this for any significantly different part of the website. Check our developer documentation for more information on supported APIs, and see our debugging guide when needed.
  • Use Chrome's Inspect Element to confirm that links use "a" HTML elements and include a rel=nofollow where appropriate (for example, in user-generated content).
  • Use Chrome's Inspect Element to check the page's title and description meta tag, any robots meta tag, and other meta data. Also check that any structured data is available on the rendered page (see the sketch after this list).
  • Content in Flash, Silverlight, or other plugin-based technologies needs to be converted to either JavaScript or "normal" HTML if it should be indexed in search.
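For larger sites, these spot checks can be partially automated. The following sketch is a minimal, illustrative example (not Google tooling, and all names are assumptions): given the HTML of a rendered page, for instance the output of Fetch & Render or your escaped-fragment snapshot, it reports the title, the description and robots meta tags, and each "a" link with its rel attribute, mirroring the checks in the list above.

```python
from html.parser import HTMLParser


class RenderCheck(HTMLParser):
    """Collects the title, meta tags, and anchor links from rendered HTML."""

    def __init__(self) -> None:
        super().__init__()
        self.title = ""
        self._in_title = False
        self.meta: dict[str, str] = {}
        self.links: list[tuple[str, str]] = []  # (href, rel)

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content") or ""
        elif tag == "a" and "href" in attrs:
            self.links.append((attrs["href"], attrs.get("rel") or ""))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def report(html: str) -> None:
    """Print the SEO-relevant bits of a rendered page for manual comparison."""
    checker = RenderCheck()
    checker.feed(html)
    print("title:      ", checker.title.strip() or "<missing>")
    print("description:", checker.meta.get("description", "<missing>"))
    print("robots:     ", checker.meta.get("robots", "<none>"))
    for href, rel in checker.links:
        print("link:", href, "rel:", rel or "<none>")
```

Running `report()` on both the #! version and the escaped version of a page makes it easier to spot differences in titles, meta tags, or links before Google takes over rendering.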