The evergreen Chromium renderer

At Google I/O this year we were happy to announce the new evergreen Googlebot.

At its core, the update switches the rendering engine from Chrome 41 to the latest stable Chromium: Googlebot now uses the latest stable Chromium to run JavaScript and render pages. We will keep updating Googlebot along with stable Chromium releases, hence the name "evergreen".

Image: a JavaScript-powered demo website staying blank in the old Googlebot but rendering correctly in the new Googlebot.

What this means for your websites

We are very happy to bring the latest features of the web platform not only to Googlebot but to the tools that let you see what Googlebot sees as well. This means websites using ES6+, Web Components and 1000+ new web platform features are now rendered with the latest stable Chromium, both in Googlebot and our testing tools.
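To make this concrete, here is a minimal, illustrative Web Component using ES6+ class syntax, Shadow DOM, and Custom Elements; these features postdate Chrome 41, so the old renderer could not execute them without transpilation or polyfills, while the evergreen renderer handles them natively. The element name and markup are made up for this example:

```html
<!-- Illustrative only: a tiny ES6+ Web Component. Class syntax, Shadow DOM v1,
     and customElements.define() all postdate Chrome 41, so the old renderer
     would have left this content blank without transpilation or polyfills. -->
<script>
  class HelloMessage extends HTMLElement {
    connectedCallback() {
      // Render the component's content into a shadow root.
      const shadow = this.attachShadow({ mode: 'open' });
      const name = this.getAttribute('name') || 'world';
      shadow.textContent = `Hello, ${name}!`;
    }
  }
  customElements.define('hello-message', HelloMessage);
</script>

<hello-message name="Googlebot"></hello-message>
```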
Image: while the previous version of the mobile-friendly test renders a blank page, the new version renders the page content correctly.

What the update changes in our testing tools

Our testing tools reflect how Googlebot processes your pages as closely as possible. With the update to the new Googlebot, we had to update them to use the same renderer as Googlebot.

The change affects rendering within our testing tools, such as the mobile-friendly test shown above. We tested these updates and, based on the feedback, have switched these tools to the new evergreen Googlebot. Much of that feedback came from Googlers and the community: Product Experts and Google Developer Experts helped us make sure the update works well.

Note: The new Googlebot still uses the same user agent as before the update. We will share more information about an update to the user agent in the near future. For now, Googlebot's user agent and the user agent used in the testing tools do not change.
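One practical implication: because the user agent string is unchanged, it does not tell you which renderer is in use, so pages should prefer feature detection over user-agent sniffing. Below is a minimal sketch; initModernApp() and initLegacyApp() are hypothetical entry points in your own code:

```javascript
// Sketch only: choose a code path by detecting features, not by parsing the
// user-agent string, which still reflects the pre-update version.
// initModernApp() and initLegacyApp() are hypothetical application entry points.
if ('customElements' in window && 'IntersectionObserver' in window) {
  initModernApp();  // modern APIs are available in the evergreen renderer
} else {
  initLegacyApp();  // fall back for genuinely older browsers
}
```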

We are excited about this update and are looking forward to your feedback and questions on Twitter, in the webmaster forum, or in our webmaster office hours.

The AJAX crawling scheme was introduced as a way of making JavaScript-based webpages accessible to Googlebot, and we've previously announced our plans to turn it down. Over time, Google engineers have significantly improved rendering of JavaScript for Googlebot. Given these advances, in the second quarter of 2018, we'll be switching to rendering these pages on Google's side, rather than requiring that sites do this themselves. In short, we'll no longer be using the AJAX crawling scheme.

As a reminder, the AJAX crawling scheme accepts pages with either a "#!" in the URL or a "fragment meta tag" on them, and then crawls them with "?_escaped_fragment_=" added to the URL. That escaped version needs to be a fully rendered or equivalent version of the page, created by the website itself.
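For illustration, this is roughly how the scheme works; the domain and fragment value below are made up:

```html
<!-- A page without "#!" in its URL could opt in with the fragment meta tag: -->
<meta name="fragment" content="!">

<!--
  Under the AJAX crawling scheme, a URL such as
    https://example.com/page#!section=profile
  was crawled by requesting the escaped version from the site itself:
    https://example.com/page?_escaped_fragment_=section=profile
  and the site had to serve a pre-rendered HTML snapshot at that address.
  With this change, Googlebot renders the #! URL directly instead.
-->
```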

With this change, Googlebot will render the #! URL directly, making it unnecessary for the website owner to provide a rendered version of the page. We'll continue to support these URLs in our search results.

We expect that most AJAX-crawling websites won't see significant changes with this update. Webmasters can double-check their pages as detailed below, and we'll be sending notifications to any sites with potential issues.

If your site is currently using either #! URLs or the fragment meta tag, we recommend: