Since initially announcing property sets earlier this year, one of the most popular requests has been to expand this functionality to more sections of Search Console. Thanks to your feedback, we're now expanding property sets to more features!

Property sets help to show how your business is seen by Google across separate websites or apps. For example, if you have multiple international or brand-specific websites, and perhaps even an Android app, it can be useful to see changes of the whole set over time: are things headed in the expected direction? are there any outliers that you'd want to drill down into? Similarly, you could monitor your site's hreflang setup across different versions of the same website during a planned transition, such as when you move from HTTP to HTTPS, or change domains.

With Search Console's property sets, you can now just add any verified properties to a set, let the data collect, and then use features like the mobile usability report, review your AMP implementation, double-check rich cards or hreflang / internationalization markup, and more.

We hope these changes make it easier to understand your properties in Search Console. Should you have any questions -- or if you just want to help others -- feel free to drop by our Webmaster Help Forums.
John Mueller, Webmaster Trends Analyst, Google Switzerland

Make Your Content Crawlable

Why? Search crawlers can only index content they can fetch and render, so it's important that your content is available to them and that each piece of content has its own accessible URL.
Best Practice:

Use server-side or hybrid rendering so users receive the content in the initial payload of their web request.

Always ensure your URLs are independently accessible:

https://www.example.com/product/25/

The above should deep link to that particular resource.
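As a minimal sketch of these two points together (Express and the renderProductPage helper are illustrative assumptions, not part of the original guidance -- any server stack that returns complete HTML per URL follows the same pattern):

const express = require('express');
const app = express();

// Render complete HTML on the server so the initial payload contains the
// content, and so /product/25/ works as an independently accessible deep link.
app.get('/product/:id/', function (req, res) {
  res.send(renderProductPage(req.params.id)); // hypothetical server-side rendering helper
});

app.listen(3000);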

If you can’t support server-side or hybrid rendering for your Progressive Web App and you decide to use client-side rendering, we recommend using the Google Search Console “Fetch as Google” tool to verify your content successfully renders for our search crawler.

Don’t:

Don't redirect users accessing deep links back to your web app's homepage.

Additionally, don't serve users an error page instead of the deep-linked content.


Provide Clean URLs

Why? Fragment identifiers (#user/24601/ or #!user/24601/) were an effective workaround for browsers to AJAX new content from a server without reloading the page. This design is known as client-side rendering.

However, the fragment identifier syntax isn’t compatible with some web tools, frameworks and protocols such as Facebook’s Open Graph protocol.

The History API enables us to update the URL without fragment identifiers while still fetching resources asynchronously and therefore avoiding page reloads -- it’s the best of both worlds. The AJAX crawling scheme (with its #! / escaped-fragment URLs) made sense at its time, but is now no longer recommended.

Our hybrid PWA and client-side PWA samples demonstrate the History API.
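A minimal sketch of the pattern (loadProduct is an illustrative stand-in for your own fetch-and-render code):

// Fetch and render new content, then update the URL without a page reload.
function showProduct(id) {
  loadProduct(id); // hypothetical helper that fetches and renders the product
  history.pushState({ productId: id }, '', '/product/' + id + '/');
}

// Re-render the right content when the user navigates back or forward.
window.addEventListener('popstate', function (event) {
  if (event.state && event.state.productId) {
    loadProduct(event.state.productId);
  }
});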

Best Practice:

Provide clean URLs without fragment identifiers (# or #!) such as:

https://www.example.com/product/25/

If using client-side or hybrid rendering, be sure to support browser navigation with the History API.

Avoid:

Using the #! URL structure to drive unique URLs is discouraged:

https://www.example.com/#!product/25/

It was introduced as a workaround before the advent of the History API. It is treated as a separate pattern from the plain # URL structure.

Don’t:

Using the # URL structure without the accompanying ! symbol is unsupported:

https://www.example.com/#product/25/

This URL structure is already a concept in the web and relates to deep linking into content on a particular page.


Specify Canonical URLs

Why? The best way to eliminate confusion for indexing when the same content is available under multiple URLs (be it on the same or different domains) is to mark one page as the canonical and have all other pages that duplicate that content refer to it.

Best Practice:

Include the following tag across all pages mirroring a particular piece of content:

<link rel="canonical" href="https://www.example.com/your-url/" />

If you are supporting Accelerated Mobile Pages, be sure to correctly use its counterpart rel="amphtml" instruction as well.
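For example, on the canonical page (the /amp/ path is illustrative):

<link rel="amphtml" href="https://www.example.com/your-url/amp/" />

The AMP version should then point back at the canonical URL with its own rel="canonical" link element.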

Avoid:

Avoid duplicating content across multiple URLs without using the rel="canonical" link element.

For example, the rel="canonical" link element can reduce ambiguity for URLs with tracking parameters.

Don’t:

Avoid creating conflicting canonical references between your pages.


Design for Multiple Devices

Why? It’s important that all your users get the best experience possible when viewing your website, regardless of their device.

Make your site responsive in its design -- fonts, margins, paddings, buttons and general design of your site should scale dynamically based on screen resolutions and device viewports.

Small images scaled up for desktop or tablet devices give a poor experience. Conversely, super high resolution images take a long time to download on mobile phones and may impact mobile scroll performance.

Read more UX for PWAs here.

Best Practice:

Use the “srcset” attribute to fetch different resolution images for different density screens to avoid downloading images larger than the device’s screen is capable of displaying.
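For example (file names are illustrative), a density-based srcset lets the browser pick the appropriate file:

<img src="product-photo-400.jpg"
     srcset="product-photo-400.jpg 1x, product-photo-800.jpg 2x"
     alt="Product photo" />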

Scale your font size and line height to ensure your text is legible no matter the size of the device. Similarly ensure the padding and margins of elements also scale sensibly.

Test various screen resolutions using the Chrome Developer Tool’s Device Mode feature and Mobile Friendly Test tool.

Don’t:

Don't show different content to users than you show to Google. If you use redirects or user agent detection (a.k.a. browser sniffing or dynamic serving) to alter the design of your site for different devices it’s important that the content itself remains the same.

Use the Search Console “Fetch as Google” tool to verify the content fetched by Google matches the content a user sees.

For usability reasons, avoid using fixed-size fonts.


Develop Iteratively

Why? One of the safest paths to take when adding features to a web application is to make changes iteratively. If you add features one at a time you can observe the impact of each individual change.

Alternatively many developers prefer to view their progressive web application as an opportunity to overhaul their mobile site in one fell swoop -- developing the new web app in an isolated environment and swapping it with their existing mobile site once ready.

When developing features iteratively try to break the changes into separate pieces. For example, if you intend to move from server-side rendering to hybrid rendering then tackle that as a single iteration -- rather than in combination with other features.

Both approaches have their own pros and cons. Iterating reduces the complexity of dealing with search indexability as the transition is continuous. However, iterating might result in a slower development process and potentially a less innovative overhaul if development is not starting from scratch.

In either case, the most sensitive areas to keep an eye on are your canonical URLs and your site’s robots.txt configuration.

Best Practice:

Iterate on your website incrementally by adding new features piece by piece.

For example, if you don’t support HTTPS yet, then start by migrating to a secure site.

Avoid:

If you’ve developed your progressive web app in an isolated environment, then avoid launching it without checking that the rel-canonical links and robots.txt are set up appropriately.

Ensure your rel-canonical links point to the real site and that your robots.txt configuration allows crawlers to crawl your new site.
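As a minimal sketch (the two files below are illustrative), a staging robots.txt typically blocks everything, while the launched site's robots.txt should allow crawling:

# robots.txt on the staging / in-development site: block all crawlers
User-agent: *
Disallow: /

# robots.txt on the launched site: allow all crawlers
User-agent: *
Disallow: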

Don’t:

It’s logical to prevent crawlers from indexing your in-development site before launch but don’t forget to unblock crawlers from accessing your new site when you launch.


Use Progressive Enhancement

Why? Wherever possible it’s important to detect browser features before using them. Feature detection is also better than testing for browsers that you believe support a given feature.

A common bad practice in the past was to enable or disable features by testing which browser the user had. However, as browsers are constantly evolving with features this technique is strongly discouraged.

Service Worker is a relatively new technology and it’s important to not break compatibility in the pursuit of progress -- it's a perfect example of when to use progressive enhancement.

Best Practice:

Before registering a Service Worker check for the availability of its API:

if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js'); // path to your service worker script
}

Use this per-API detection method for all of your website’s features.
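The same pattern applies to any other API -- for example, geolocation (the two helper functions are illustrative):

if ('geolocation' in navigator) {
  navigator.geolocation.getCurrentPosition(showNearbyStores); // hypothetical success callback
} else {
  showManualLocationForm(); // hypothetical fallback when the API is unavailable
}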

Don’t:

Never use the browser’s user agent to enable or disable features in your web app. Always check whether the feature’s API is available and gracefully degrade if unavailable.

Avoid updating or launching your site without testing across multiple browsers! Check your site analytics to learn which browsers are most popular among your user base.


Test with Search Console

Why? It’s important to understand how Google Search views your site’s content. You can use Search Console to fetch individual URLs from your site and see how Google Search views them using the “Crawl > Fetch as Google” feature. Search Console will process your JavaScript and render the page when that option is selected; otherwise only the raw HTML response is shown.

Google Search Console also analyses the content on your page in a variety of ways including detecting the presence of Structured Data, Rich Cards, Sitelinks & Accelerated Mobile Pages.

Best Practice:

Monitor your site using Search Console and explore its features including “Fetch as Google”.

Provide a sitemap via the Search Console “Crawl > Sitemaps” feature. It can be an effective way to ensure Google Search is aware of all your site’s pages.


Annotate with Schema.org structured data

Why? Schema.org structured data is a flexible vocabulary for summarizing the most important parts of your page as machine-processable data. This can be as general as simply saying that a page is a NewsArticle, or as specific as detailing the location, band name, venue and ticket vendor for a touring band, or summarizing the ingredients and steps for a recipe.

The use of this metadata may not make sense for every page on your web application but it’s recommended where it’s sensible. Google extracts it after the page is rendered.

There are a variety of data types including “NewsArticle”, “Recipe” & “Product” to name a few. Explore all the supported data types here.
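A minimal sketch in the JSON-LD format that Google can understand (the product values are illustrative):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-Shirt",
  "image": "https://www.example.com/images/t-shirt.jpg",
  "description": "A plain cotton t-shirt.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>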

Best Practice:

Verify that your Schema.org metadata is correct using Google’s Structured Data Testing Tool.

Check that the data you provided is appearing and there are no errors present.

Don’t:

Avoid using a data type that doesn’t match your page’s actual content. For example, don’t use “Recipe” for a T-Shirt you’re selling -- use “Product” instead.


Annotate with Open Graph & Twitter Cards

Why? In addition to the Schema.org metadata it can be helpful to add support for Facebook’s Open Graph protocol and Twitter rich cards as well.

These metadata formats improve the user experience when your content is shared on their corresponding social networks.

If your existing site or web application utilises these formats it’s important to ensure they are included in your progressive web application as well for optimal virality.
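A minimal sketch of both formats in a page’s head (titles, descriptions and image URLs are illustrative):

<!-- Open Graph -->
<meta property="og:title" content="Example Product">
<meta property="og:description" content="A short description shown when the page is shared.">
<meta property="og:image" content="https://www.example.com/images/product.jpg">
<meta property="og:url" content="https://www.example.com/product/25/">

<!-- Twitter Card -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Example Product">
<meta name="twitter:description" content="A short description shown when the page is shared.">
<meta name="twitter:image" content="https://www.example.com/images/product.jpg">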

Best Practice:

Test your Open Graph markup with the Facebook Object Debugger Tool.

Familiarise yourself with Twitter’s metadata format.

Don’t:

Don’t forget to include these formats if your existing site supports them.


Test with Multiple Browsers

Why? Clearly from a user perspective it’s important that a website behaves the same across all browsers. While the experience might adapt for different screen sizes, we all expect a mobile site to work the same on similarly sized devices whether it’s an iPhone or an Android mobile phone.

While the web can be perceived as fragmented due to the number of browsers in use around the world, this variety and competition is part of what makes the web such an innovative platform. Thankfully, web standards have never been more mature than they are now and modern tools enable developers to build rich, cross browser compatible websites with confidence.

Best Practice:

Use cross browser testing tools such as BrowserStack.com, Browserling.com or BrowserShots.org to ensure your PWA is cross browser compatible.


Measure Page Load Performance

Why? The faster a website loads for a user the better their user experience will be. Optimizing for page speed is already a well known focus in web development but sometimes when developing a new version of a site the necessary optimizations are not considered a high priority.

When developing a progressive web application we recommend measuring the performance of your page load speed and optimizing before launching the site for the best results.

Best Practice:

Use tools such as PageSpeed Insights and WebPageTest to measure the page load performance of your site. While Googlebot has a bit more patience in rendering, research has shown that 40% of consumers will leave a page that takes longer than three seconds to load.

Read more about our web page performance recommendations and the critical rendering path here.

Don’t:

Avoid leaving optimization as a post-launch step. If your website’s content loads quickly before migrating to a new progressive web application then it’s important to not regress in your optimizations.


We hope that the above checklist is useful and provides the right guidance to help you develop your Progressive Web Applications with indexability in mind.

As you get started, be sure to check out our Progressive Web App indexability samples that demonstrate server-side, client-side and hybrid rendering. As always, if you have any questions, please reach out on our Webmaster Forums.


Chart: Percentage of pages loaded over HTTPS in Chrome



As the remainder of the web transitions to HTTPS, we’ll continue working to ensure that migrating to HTTPS is a no-brainer, providing business benefit beyond increased security. HTTPS currently enables the best performance the web offers and powerful features that benefit site conversions, including both new features such as service workers for offline support and web push notifications, and existing features such as credit card autofill and the HTML5 geolocation API that are too powerful to be used over non-secure HTTP. As with all major site migrations, there are certain steps webmasters should take to ensure that search ranking transitions are smooth when moving to HTTPS. To help with this, we’ve posted two FAQs to help sites transition correctly, and will continue to improve our web fundamentals guidance.


We’ve seen many sites successfully transition with negligible effect on their search ranking and traffic. Brian Wood, Director of Marketing SEO at Wayfair, a large retail site, commented: “We were able to migrate Wayfair.com to HTTPS with no meaningful impact to Google rankings or Google organic search traffic. We are very pleased to say that all Wayfair sites are now fully HTTPS.” CNET, a large tech news site, had a similar experience: “We successfully completed our move of CNET.com to HTTPS last month,” said John Sherwood, Vice President of Engineering & Technology at CNET. “Since then, there has been no change in our Google rankings or Google organic search traffic.”


Webmasters that include ads on their sites also should carefully monitor ad performance and revenue during large site migrations. The portion of Google ad traffic served over HTTPS has increased dramatically over the past 3 years. All ads that come from any Google source always support HTTPS, including AdWords, AdSense, and DoubleClick Ad Exchange; ads sold directly, such as those through DoubleClick for Publishers, still need to be designed to be HTTPS-friendly. This means there will be no change to the Google-sourced ads that appear on a site after migrating to HTTPS. Many publishing partners have seen this in practice after a successful HTTPS transition. Jason Tollestrup, Director of Programmatic Advertising for the Washington Post, “saw no material impact to AdX revenue with the transition to SSL.”


As migrating to HTTPS becomes even easier, we’ll continue working towards a web that’s secure by default. Don’t hesitate to start planning your HTTPS migration today!



Critic reviews
In the U.S. on mobile and desktop, qualifying publishers can participate in the critic review feature in local Knowledge Panels. Critic reviews possess an editorial tone of voice and have an opinionated position on the local business, coming from an editor or on-the-ground expert. For more information on how to participate, see the details in our critic reviews page.
The local information across Google Search helps millions of people, every day, discover and share great places. If you have any questions, please visit our webmaster forums.



The updated information provides more specific explanations of six different security issues detected by Safe Browsing, including malware, deceptive pages, harmful downloads, and uncommon downloads. These explanations give webmasters more context and detail about what Safe Browsing found. We also offer tailored recommendations for each type of issue, including sample URLs that webmasters can check to identify the source of the issue, as well as specific remediation actions webmasters can take to resolve the issue.

We on the Safe Browsing team definitely recommend registering your site in Search Console even if it is not currently experiencing a security issue. We send notifications through Search Console so webmasters can address any issues that appear as quickly as possible.

Our goal is to help webmasters provide a safe and secure browsing experience for their users. We welcome any questions or feedback about the new features on the Google Webmaster Help Forum, where Top Contributors and Google employees are available to help.

For more information about Safe Browsing’s ongoing work to shine light on the state of web security and encourage safer web security practices, check out our summary of trends and findings on the Safe Browsing Transparency Report. If you’re interested in the tools Google provides for webmasters and developers dealing with hacked sites, this video provides a great overview.


An example of an intrusive standalone interstitial

Another example of an intrusive standalone interstitial

 

By contrast, here are some examples of techniques that, used responsibly, would not be affected by the new signal:

Examples of interstitials that would not be affected by the new signal, if used responsibly


An example of an interstitial for cookie usage

An example of an interstitial for age verification

An example of a banner that uses a reasonable amount of screen space

 

We previously explored a signal that checked for interstitials that ask a user to install a mobile app. As we continued our development efforts, we saw the need to broaden our focus to interstitials more generally. Accordingly, to avoid duplication in our signals, we've removed the check for app-install interstitials from the mobile-friendly test and have incorporated it into this new signal in Search.

Remember, this new signal is just one of hundreds of signals that are used in ranking. The intent of the search query is still a very strong signal, so a page may still rank highly if it has great, relevant content. As always, if you have any questions or feedback, please visit our webmaster forums.



Publishers with critic reviews for local entities can get up and running by selecting snippets of reviews from their sites and annotating them and the associated business with schema.org markup. This process, detailed in our critic reviews markup instructions, allows publishers to communicate to Google which snippet they prefer, what URL is associated with the review and other metadata about the local business that allows us to ensure that we’re showing the right review for the right entity.

Google can understand a variety of markup formats, including the JSON-LD data format, which makes it easier than ever to incorporate structured data about reviews into webpages! Publishers can get started here. And as always, if you have any questions, please visit our webmaster forums.
