Industry News

Google’s John Mueller Shares Recommendations for JavaScript Sites / Progressive Web Apps


Google’s John Mueller has shared a post on his Google+ account outlining Google Search recommendations for JavaScript sites/Progressive Web Apps. The recommendations are not new, but according to Mueller they will remain valid for the foreseeable future, so they are worth noting. The major points from his post are:

  1. Don’t cloak to Googlebot
  2. Use rel=canonical tag
  3. Avoid the AJAX-Crawling scheme
  4. Avoid using “#” in URLs (outside of “#!”)
  5. Ensure that all required resources aren’t blocked by robots.txt
  6. Limit the number of embedded resources in site
  7. JavaScript can be used to provide titles, description & robots meta tags, structured data, and other meta-data
  8. Use sitemap file with correct “lastmod” dates

Here is the full text of his Google+ post:

An update (March 2016) on the current state & recommendations for JavaScript sites / Progressive Web Apps [1] in Google Search.

We occasionally see questions about what JS-based sites can do and still be visible in search, so here’s a brief summary for today’s state:

# Don’t cloak to Googlebot. Use “feature detection” & “progressive enhancement” [2] techniques to make your content available to all users. Avoid redirecting to an “unsupported browser” page. Consider using a polyfill or other safe fallback where needed. The features Googlebot currently doesn’t support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.
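The feature-detection approach Mueller describes can be sketched roughly as follows: test for each API before using it, and degrade gracefully rather than redirecting unsupported clients. The helper names here are illustrative, not from the post, and the environment object is injected so the check can run outside a browser.

```javascript
// Minimal feature-detection sketch for the APIs the post says Googlebot
// currently lacks: Service Workers, the Fetch API, Promises, requestAnimationFrame.
// `env` stands in for the global object (e.g. `window` in a browser).
function detectFeatures(env) {
  return {
    serviceWorker: typeof env.navigator === "object" &&
                   env.navigator !== null && "serviceWorker" in env.navigator,
    fetch: typeof env.fetch === "function",
    promises: typeof env.Promise === "function",
    raf: typeof env.requestAnimationFrame === "function",
  };
}

// Progressive enhancement: use the modern path when available, fall back otherwise,
// instead of sending unsupported clients to an "unsupported browser" page.
function loadData(env, url, onDone) {
  const has = detectFeatures(env);
  if (has.fetch && has.promises) {
    env.fetch(url).then((r) => r.text()).then(onDone);
  } else {
    // Fallback path for older clients (e.g. XMLHttpRequest or server-rendered HTML).
    onDone(null);
  }
}
```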

# Use rel=canonical [3] when serving content from multiple URLs is required.
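As a rough sketch of that point, a JS-rendered page serving the same content from several URLs (say, with tracking parameters) can point all variants at one preferred URL. Both helper names below are assumptions for illustration.

```javascript
// Build the rel=canonical tag markup for a preferred URL.
function canonicalTag(preferredUrl) {
  return '<link rel="canonical" href="' + preferredUrl + '">';
}

// In a browser, the same tag can be injected into <head> at render time.
function injectCanonical(doc, preferredUrl) {
  const link = doc.createElement("link");
  link.setAttribute("rel", "canonical");
  link.setAttribute("href", preferredUrl);
  doc.head.appendChild(link);
}
```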

# Avoid the AJAX-Crawling scheme on new sites. Consider migrating old sites that use this scheme soon. Remember to remove “meta fragment” tags when migrating. Don’t use a “meta fragment” tag if the “escaped fragment” URL doesn’t serve fully rendered content. [4]
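The "meta fragment" tag in question is `<meta name="fragment" content="!">`. A quick way to audit pages for it during a migration might look like this (the helper is a hypothetical sketch, not a Google tool):

```javascript
// Returns true if the page's HTML still carries the AJAX-Crawling
// "meta fragment" tag that the post says should be removed when migrating.
function hasMetaFragment(html) {
  return /<meta\s+name=["']fragment["']\s+content=["']!["']/i.test(html);
}
```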

# Avoid using “#” in URLs (outside of “#!”). Googlebot rarely indexes URLs with “#” in them. Use “normal” URLs with path/filename/query-parameters instead, consider using the History API for navigation.
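A minimal sketch of that migration: map old "#!" fragment URLs onto normal paths and navigate with the History API instead of fragments. `hashToPath` and `navigate` are illustrative names, not part of any standard API.

```javascript
// Convert an old "#!"-style URL to a clean path-based URL.
function hashToPath(url) {
  const i = url.indexOf("#!");
  if (i === -1) return url;                       // already a "normal" URL
  const base = url.slice(0, i).replace(/\/$/, ""); // drop trailing slash
  return base + "/" + url.slice(i + 2).replace(/^\//, "");
}

// In the browser, navigate without a full reload using the History API,
// so the address bar shows an indexable path/query URL.
function navigate(win, path, render) {
  win.history.pushState({}, "", path);
  render(path); // app re-renders the new route
}
```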

# Use Search Console’s Fetch and Render tool [5] to test how Googlebot sees your pages. Note that this tool doesn’t support “#!” or “#” URLs.

# Ensure that all required resources (including JavaScript files / frameworks, server responses, 3rd-party APIs, etc) aren’t blocked by robots.txt. The Fetch and Render tool will list blocked resources discovered. If resources are uncontrollably blocked by robots.txt (e.g., 3rd-party APIs) or otherwise temporarily unavailable, ensure that your client-side code fails gracefully.
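"Failing gracefully" here can be as simple as catching a failed resource load and rendering the page without that component. In this sketch the fetch function is injected so the fallback path can be exercised without a network; the names are assumptions.

```javascript
// Load an optional widget's data; if the resource is blocked (e.g. a
// robots.txt-disallowed 3rd-party API) or temporarily unavailable,
// resolve with ok=false so the rest of the page still renders.
function loadWidget(fetchFn, url) {
  return fetchFn(url)
    .then((data) => ({ ok: true, data }))
    .catch(() => ({ ok: false, data: null }));
}
```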

# Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts & rendering without these resources being available (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives.
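One way to apply "reasonable HTTP caching directives" is to vary Cache-Control by resource type, so static assets can be reused across page renders. The specific max-age values below are illustrative assumptions, not Google guidance.

```javascript
// Pick a Cache-Control header by file type: long-lived for versioned
// static assets, shorter for images, revalidate-always for HTML.
function cacheHeaderFor(path) {
  if (/\.(js|css|woff2?)$/.test(path)) {
    return "public, max-age=31536000, immutable"; // versioned JS/CSS/fonts
  }
  if (/\.(png|jpe?g|svg|webp)$/.test(path)) {
    return "public, max-age=86400";               // images: one day
  }
  return "no-cache";                              // HTML: always revalidate
}
```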

# Google supports the use of JavaScript to provide titles, description & robots meta tags, structured data, and other meta-data. When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct “lastmod” dates for signaling changes on your website.
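The sitemap "lastmod" advice can be sketched as generating one `<url>` entry per page with its true last-modified date. `buildUrlEntry` is a hypothetical helper, not part of any sitemap library.

```javascript
// Build one sitemap <url> entry with a W3C-format (YYYY-MM-DD) lastmod date,
// so changed pages signal their modification time to search engines.
function buildUrlEntry(loc, lastmodDate) {
  const lastmod = lastmodDate.toISOString().slice(0, 10);
  return "<url><loc>" + loc + "</loc><lastmod>" + lastmod + "</lastmod></url>";
}
```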

# Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support a different subset.

Looking at this list, none of these recommendations are completely new & limited to today — and they’ll continue to be valid for the foreseeable future. Working with modern JavaScript frameworks for search can be a bit intimidating at first, but they open up some really neat possibilities to make fast & awesome sites!

I hope this was useful! Let me know if I missed anything, or if you need clarifications for any part.

Links:
[1] PWA: https://developers.google.com/web/progressive-web-apps
[2] Progressive enhancement: https://en.wikipedia.org/wiki/Progressive_enhancement
[3] rel=canonical: https://support.google.com/webmasters/answer/139066
[4] AJAX Crawling scheme: https://developers.google.com/webmasters/ajax-crawling/docs/specification
[5] Fetch and Render: https://support.google.com/webmasters/answer/6066468

Google has long been working on ways to crawl dynamic content. Even so, these recommendations are worth noting for site owners who use JavaScript or other dynamic content on their sites.

Renz Joe David plunged into the world of Digital Marketing in 2013 and has since been passionate about exploring new areas in SEO. His interests lie in Link Building, Lead Generation, Social Media, Reputation Management, Analytics, Penalty Abatement, and Marketing Tools. Outside of work he likes technology and driving.