Dozens upon dozens of talks will take place over the next 3 days. We have hand-picked the talks that we think will be the most interesting for webmasters and SEO professionals. Each link below leads to a page with more details about the talk and how to tune in to the live stream. All times are Pacific Time (PT). We might add other sessions to this list.
Tuesday, May 8th
3pm - Web Security post-Spectre/Meltdown, with Emily Schechter and Chris Palmer - more info.
5pm - Dru Knox and Stephan Somogyi talk about building a seamless web with Chrome - more info.
Wednesday, May 9th
9:30am - Ewa Gasperowicz and Addy Osmani talk about Web Performance and increasing control over the loading experience - more info.
10:30am - Alberto Medina and Thierry Muller will explain how to make a WordPress site progressive - more info.
11:30am - Rob Dodson and Dominic Mazzoni will cover "What's new in web accessibility" - more info.
3:30pm - Michael Bleigh will introduce how to leverage AMP in Firebase for a blazing-fast website - more info.
4:30pm - Rick Viscomi and Vinamrata Singal will introduce the latest with Lighthouse and Chrome UX Report for Web Performance - more info.
Thursday, May 10th
8:30am - John Mueller and Tom Greenaway will talk about building Search-friendly JavaScript websites - more info.
9:30am - Build e-commerce sites for the modern web with AMP, PWA, and more, with Adam Greenberg and Rowan Merewood - more info.
12:30pm - Session on "Building a successful web presence with Google Search" by John Mueller and Mariya Moeva - more info.
This list is only a small part of the agenda that we think is useful to webmasters and SEOs. There are many more sessions that you could find interesting! To learn about those other talks, check out the full list of “web” sessions, design sessions, Cloud sessions, machine learning sessions, and more. Use the filtering function to toggle the sessions on and off.
We hope you can make the time to watch the talks online and participate in the excitement of I/O! The videos will also be available on YouTube after the event, in case you can't tune in live.
Posted by Vincent Courson, Search Outreach Specialist, and the Google Webmasters team
The current Lighthouse Chrome extension contains an initial set of SEO audits which we’re planning to extend and enhance in the future. Once we're confident of its functionality, we’ll make the audits available by default in the stable release of Chrome Developer Tools.
We hope you find this functionality useful for your current and future projects. If these basic SEO tips are totally new to you and you find yourself interested in this area, make sure to read our complete SEO starter guide! Leave your feedback and suggestions in the comments section below, on GitHub, or on our Webmaster forum.
Happy auditing!
Posted by Valentyn, Webmaster Outreach Strategist.
Use server-side or hybrid rendering so users receive the content in the initial payload of their web request (see the sketch below).
Always ensure your URLs are independently accessible:
https://www.example.com/product/25/
The above should deep link to that particular resource.
If you can’t support server-side or hybrid rendering for your Progressive Web App and you decide to use client-side rendering, we recommend using the Google Search Console “Fetch as Google” tool to verify that your content successfully renders for our search crawler.
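To illustrate the first recommendation above, here is a minimal server-side rendering sketch using Node.js with Express; the stack choice and the inline renderProductHtml() helper are assumptions, standing in for whatever templating and data layer your site actually uses:

const express = require('express');
const app = express();

// Hypothetical renderer; a real app would use templates and a data store.
function renderProductHtml(id) {
  return '<!DOCTYPE html><html><body><h1>Product ' + id + '</h1></body></html>';
}

app.get('/product/:id', function(req, res) {
  // The full markup is in the initial response, so users and crawlers
  // receive the content without waiting for client-side JavaScript.
  res.send(renderProductHtml(req.params.id));
});

app.listen(3000);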
Don’t:
Don't redirect users accessing deep links back to your web app's homepage.
Additionally, serving an error page to users instead of deep linking should also be avoided.
Provide Clean URLs
Why? Fragment identifiers (#user/24601/ or #!user/24601/) were an effective workaround that let browsers fetch new content from a server via AJAX without reloading the page. This design is known as client-side rendering.
However, the fragment identifier syntax isn’t compatible with some web tools, frameworks and protocols such as Facebook’s Open Graph protocol.
The History API enables us to update the URL without fragment identifiers while still fetching resources asynchronously, and therefore avoiding page reloads -- it’s the best of both worlds. The AJAX crawling scheme (with its #! / escaped-fragment URLs) made sense in its time, but is now no longer recommended.
Provide clean URLs without fragment identifiers (# or #!) such as:
https://www.example.com/product/25/
If using client-side or hybrid rendering, be sure to support browser navigation with the History API, as sketched below.
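For illustration, a client-rendered app can keep clean URLs by pairing history.pushState() with a popstate handler; loadProduct() below is a hypothetical function that fetches and renders the product view:

// Navigate to a clean URL without a full page reload.
function showProduct(id) {
  history.pushState({productId: id}, '', '/product/' + id + '/');
  loadProduct(id); // hypothetical: fetch and render the product view
}

// Support the browser's back/forward buttons.
window.addEventListener('popstate', function(event) {
  if (event.state && event.state.productId) {
    loadProduct(event.state.productId);
  }
});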
Avoid:
Using the #! URL structure to drive unique URLs is discouraged:
https://www.example.com/#!product/25/
It was introduced as a workaround before the advent of the History API. It is considered a separate pattern from the plain # URL structure.
Don’t:
Using the # URL structure without the accompanying ! symbol is unsupported:
https://www.example.com/#product/25/
This URL structure already has a meaning on the web: it deep links to a fragment (anchor) within a particular page.
Specify Canonical URLs
Why? The best way to eliminate confusion for indexing when the same content is available under multiple URLs (be it the same or different domains) is to mark one page as the canonical version and have all other pages that duplicate that content refer to it.
Best Practice:
Include the following tag across all pages mirroring a particular piece of content:
<link rel="canonical" href="https://www.example.com/product/25/" />
If you are supporting Accelerated Mobile Pages, be sure to correctly use its counterpart rel="amphtml" instruction as well.
Avoid:
Avoid purposely duplicating content across multiple URLs and not using the rel="canonical" link element.
For example, the rel="canonical" link element can reduce ambiguity for URLs with tracking parameters.
Don’t:
Avoid creating conflicting canonical references between your pages.
Design for Multiple Devices
Why? It’s important that all your users get the best experience possible when viewing your website, regardless of their device.
Make your site responsive -- fonts, margins, padding, buttons, and the general layout of your site should scale dynamically based on screen resolutions and device viewports.
Small images scaled up for desktop or tablet devices give a poor experience. Conversely, super high resolution images take a long time to download on mobile phones and may impact mobile scroll performance.
Use the “srcset” attribute to fetch different-resolution images for different-density screens, to avoid downloading images larger than the device’s screen is capable of displaying (see the example at the end of this section).
Scale your font size and line height to ensure your text is legible no matter the size of the device. Similarly ensure the padding and margins of elements also scale sensibly.
Don't show different content to users than you show to Google. If you use redirects or user agent detection (a.k.a. browser sniffing or dynamic serving) to alter the design of your site for different devices it’s important that the content itself remains the same.
Use the Search Console “Fetch as Google” tool to verify the content fetched by Google matches the content a user sees.
For usability reasons, avoid using fixed-size fonts.
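As referenced above, a density-based “srcset” might look like the following (file names are illustrative):

<img src="photo-1x.jpg"
     srcset="photo-1x.jpg 1x, photo-2x.jpg 2x, photo-3x.jpg 3x"
     alt="Product photo">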
Develop Iteratively
Why? One of the safest paths to take when adding features to a web application is to make changes iteratively. If you add features one at a time you can observe the impact of each individual change.
Alternatively many developers prefer to view their progressive web application as an opportunity to overhaul their mobile site in one fell swoop -- developing the new web app in an isolated environment and swapping it with their existing mobile site once ready.
When developing features iteratively try to break the changes into separate pieces. For example, if you intend to move from server-side rendering to hybrid rendering then tackle that as a single iteration -- rather than in combination with other features.
Both approaches have their own pros and cons. Iterating reduces the complexity of dealing with search indexability as the transition is continuous. However, iterating might result in a slower development process and potentially a less innovative overhaul if development is not starting from scratch.
In either case, the most sensitive areas to keep an eye on are your canonical URLs and your site’s robots.txt configuration.
Best Practice:
Iterate on your website incrementally by adding new features piece by piece.
For example, if you don’t support HTTPS yet, then start by migrating to a secure site.
Avoid:
If you’ve developed your progressive web app in an isolated environment, then avoid launching it without checking that the rel-canonical links and robots.txt are set up appropriately.
Ensure your rel-canonical links point to the real site and that your robots.txt configuration allows crawlers to crawl your new site.
Don’t:
It’s logical to prevent crawlers from indexing your in-development site before launch, but don’t forget to unblock crawlers from accessing your new site when you launch.
Use Progressive Enhancement
Why? Wherever possible it’s important to detect browser features before using them. Feature detection is also better than testing for browsers that you believe support a given feature.
A common bad practice in the past was to enable or disable features by testing which browser the user had. However, as browsers are constantly evolving with features this technique is strongly discouraged.
Service Worker is a relatively new technology and it’s important to not break compatibility in the pursuit of progress -- it's a perfect example of when to use progressive enhancement.
Best Practice:
Before registering a Service Worker, check for the availability of its API:
if ('serviceWorker' in navigator) {
  // Register the service worker ('/sw.js' is an example path).
  navigator.serviceWorker.register('/sw.js');
}
Use this per-API feature-detection approach for all of your website’s features.
Don’t:
Never use the browser’s user agent to enable or disable features in your web app. Always check whether the feature’s API is available and gracefully degrade if unavailable.
Avoid updating or launching your site without testing across multiple browsers! Check your site analytics to learn which browsers are most popular among your user base.
Test with Search Console
Why? It’s important to understand how Google Search views your site’s content. You can use Search Console to fetch individual URLs from your site and see how Google Search views them using the “Crawl > Fetch as Google“ feature. Search Console will process your JavaScript and render the page when that option is selected; otherwise, only the raw HTML response is shown.
Google Search Console also analyses the content on your page in a variety of ways including detecting the presence of Structured Data, Rich Cards, Sitelinks & Accelerated Mobile Pages.
Best Practice:
Monitor your site using Search Console and explore its features including “Fetch as Google”.
Provide a Sitemap via Search Console “Crawl > Sitemaps”. It can be an effective way to ensure Google Search is aware of all your site’s pages.
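For reference, a minimal sitemap file is just an XML list of URLs; the entry below reuses this checklist’s example URL:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/product/25/</loc>
  </url>
</urlset>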
Annotate with Schema.org structured data
Why? Schema.org structured data is a flexible vocabulary for summarizing the most important parts of your page as machine-processable data. This can be as general as simply saying that a page is a NewsArticle, or as specific as detailing the location, band name, venue and ticket vendor for a touring band, or summarizing the ingredients and steps for a recipe.
The use of this metadata may not make sense for every page on your web application but it’s recommended where it’s sensible. Google extracts it after the page is rendered.
There are a variety of data types including “NewsArticle”, “Recipe” & “Product” to name a few. Explore all the supported data types here.
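As a brief illustration, a minimal JSON-LD block for a “Product” page might look like this (all names and values are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-Shirt",
  "image": "https://www.example.com/images/t-shirt.jpg",
  "description": "A plain cotton t-shirt.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>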
Use the Structured Data Testing Tool to check that the data you provided is appearing and that there are no errors present.
Don’t:
Avoid using a data type that doesn’t match your page’s actual content. For example don’t use “Recipe” for a T-Shirt you’re selling -- use “Product” instead.
Annotate with Open Graph & Twitter Cards
Why? In addition to the Schema.org metadata it can be helpful to add support for Facebook’s Open Graph protocol and Twitter rich cards as well.
These metadata formats improve the user experience when your content is shared on their corresponding social networks.
If your existing site or web application utilises these formats it’s important to ensure they are included in your progressive web application as well for optimal virality.
Don’t forget to include these formats if your existing site supports them.
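For reference, a minimal set of Open Graph and Twitter Card tags might look like this (all values are placeholders):

<meta property="og:title" content="Example Product">
<meta property="og:type" content="website">
<meta property="og:image" content="https://www.example.com/images/product.jpg">
<meta property="og:url" content="https://www.example.com/product/25/">
<meta name="twitter:card" content="summary">
<meta name="twitter:title" content="Example Product">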
Test with Multiple Browsers
Why? Clearly from a user perspective, it’s important that a website behaves the same across all browsers. While the experience might adapt for different screen sizes, we all expect a mobile site to work the same on similarly sized devices whether it’s an iPhone or an Android mobile phone.
While the web can be perceived as fragmented due to the number of browsers in use around the world, this variety and competition is part of what makes the web such an innovative platform. Thankfully, web standards have never been more mature than they are now, and modern tools enable developers to build rich, cross-browser compatible websites with confidence.
Measure Page Load Performance
Why? The faster a website loads for a user, the better their user experience will be. Optimizing for page speed is already a well-known focus in web development, but sometimes when developing a new version of a site the necessary optimizations are not considered a high priority.
When developing a progressive web application we recommend measuring the performance of your page load speed and optimizing before launching the site for the best results.
Best Practice:
Use tools such as PageSpeed Insights and WebPagetest to measure the page load performance of your site. While Googlebot has a bit more patience in rendering, research has shown that 40% of consumers will leave a page that takes longer than three seconds to load.
Avoid leaving optimization as a post-launch step. If your website’s content loads quickly before migrating to a new progressive web application, then it’s important not to regress in your optimizations.
We hope that the above checklist is useful and provides the right guidance to help you develop your Progressive Web Applications with indexability in mind.
Poor usability can diminish the benefits of a fast page load. We know the average mobile page takes more than 7 seconds to load, and by using the PageSpeed Insights tool and following its speed recommendations, you can make your page load much faster. But suppose your fast mobile site loads in just 2 seconds instead of 7 seconds. If mobile users still have to spend another 5 seconds once the page loads to pinch-zoom and scroll the screen before they can start reading the text and interacting with the page, then that site isn’t really fast to use after all. PageSpeed Insights’ new User Experience rules can help you find and fix these usability issues.
These new recommendations currently cover the following areas:
Configure the viewport: Without a meta-viewport tag, modern mobile browsers will assume your page is not mobile-friendly, and will fall back to a desktop viewport and possibly apply font-boosting, interfering with your intended page layout. Configuring the viewport to width=device-width should be your first step in mobilizing your site (see the example after this list).
Size content to the viewport: Users expect mobile sites to scroll vertically, not horizontally. Once you’ve configured your viewport, make sure your page content fits the width of that viewport, keeping in mind that not all mobile devices are the same width.
Use legible font sizes: If users have to zoom in just to be able to read your article text on their smartphone screen, then your site isn’t mobile-friendly. PageSpeed Insights checks that your site’s text is large enough for most users to read comfortably.
Size tap targets appropriately: Nothing’s more frustrating than trying to tap a button or link on a phone or tablet touchscreen, and accidentally hitting the wrong one because your finger pad is much bigger than a desktop mouse cursor. Make sure that your mobile site’s touchscreen tap targets are large enough to press easily.
Avoid plugins: Most smartphones don’t support Flash or other browser plugins, so make sure your mobile site doesn't rely on plugins.
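To illustrate the first recommendation above, the standard viewport meta tag looks like this:

<meta name="viewport" content="width=device-width, initial-scale=1">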
These rules are described in more detail in our help pages. When you’re ready, you can test your pages and the improvements you make using the PageSpeed Insights tool. We’ve also updated PageSpeed Insights to use a mobile friendly design, and we’ve translated our documents into additional languages.
As always, if you have any questions or feedback, please post in our discussion group.
Posted by Matthew Steele and Doantam Phan, PageSpeed Insights team
We've been closely watching performance and listening to webmaster feedback. Since Instant Pages rolled out, we've saved more than a thousand years of our users' time. We're very happy with the results so far, and we'll be gradually increasing how often we trigger the feature.
In the vast majority of cases, webmasters don't have to do anything for their sites to work correctly with prerendering. As we mentioned in our initial announcement of Instant Pages, search traffic will be measured in Webmaster Tools just like before this feature: only results the user visits will be counted. If your site keeps track of pageviews on its own, you might be interested in the Page Visibility API, which allows you to detect when prerendering is occurring and factor those out of your statistics. If you use an ads or analytics package, check with them to see if their solution is already prerender-aware; if it is, in many cases you won't need to make any changes at all. If you're interested in triggering Chrome's prerendering within your own site, see the Prerendering in Chrome article.
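As a rough sketch of factoring prerenders out of your own pageview counts, you could defer counting until a prerendered page actually becomes visible. Here recordPageview() is a hypothetical stand-in for your analytics call, and the sketch uses the unprefixed form of the API (early Chrome versions exposed it as document.webkitVisibilityState):

function recordPageview() {
  // Hypothetical: send a pageview hit to your analytics backend.
}

if (document.visibilityState === 'prerender') {
  // The page is being prerendered: count the view only once it's shown.
  document.addEventListener('visibilitychange', function onVisible() {
    if (document.visibilityState === 'visible') {
      document.removeEventListener('visibilitychange', onVisible);
      recordPageview();
    }
  });
} else {
  recordPageview();
}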
Instant Pages means that users arrive at your site happier and more engaged, which is great for everyone.
Posted by Ziga Mahkovec - Software Engineer, Instant Pages
Just curious about the Q&A? No problem! Here you go:
Is it possible to check my server response time from different areas around the world?
Yes. WebPagetest.org can test performance from the United States (both East and West Coast—go West Coast! :), United Kingdom, China, and New Zealand.
What's a good response time to aim for?
First, if your competition is fast, they may provide a better user experience than your site for your same audience. In that case, you may want to make your site better, stronger, faster...
Otherwise, studies by Akamai claim 2 seconds as the threshold for ecommerce site "acceptability." Just as an FYI, at Google we aim for under a half-second.
Does progressive rendering help users?
Definitely! Progressive rendering is when a browser can display content incrementally as it becomes available, rather than waiting for all the content to display at once. This gives users faster visual feedback and helps them feel more in control. Bing experimented with progressive rendering by sending users their visual header (like the logo and search box) quickly, then the results/ads once they were available. Bing found a 0.7% increase in satisfaction with progressive rendering, and commented that this improvement was comparable to that of a full feature rollout.
How can you implement progressive rendering techniques on your site? Put stylesheets at the top of the page. This allows a browser to start displaying content ASAP, as in the sketch below.
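A minimal illustration of that ordering (file names are placeholders):

<!DOCTYPE html>
<html>
<head>
  <!-- Declaring the stylesheet up front lets rendering begin early. -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <!-- Content below can paint progressively as it streams in. -->
  ...
</body>
</html>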
The Page Speed plugin, videos, articles, and help forum are all found at code.google.com/speed/.
Written by Maile Ohye, Developer Programs Tech Lead
While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven't seen much change to your site rankings, then this site speed change possibly did not impact your site.
We encourage you to start looking at your site's speed (the tools above provide a great starting point) — not only to improve your ranking in search engines, but also to improve everyone's experience on the Internet.
Posted by Amit Singhal, Google Fellow and Matt Cutts, Principal Engineer, Google Search Quality Team