Google has a broad range of resources to help you better understand your website and improve its performance. This Webmaster Central Blog, the Help Center, the Webmaster forum, and the recently released Search Engine Optimization (SEO) Starter Guide are just a few.

We also have a YouTube channel with answers to your questions in video format. For short, to-the-point answers to specific questions, we've just launched a new series, which we call SEO Snippets.

In this series of short videos, the Google team will answer some of the webmaster and SEO questions that we regularly see on the Webmaster Central Help Forum. From 404 errors and how crawling works to URL structure and duplicate content, we'll have something here for you.

Check out the links shared in the videos to get more helpful webmaster information, drop by our help forum and subscribe to our YouTube channel for more tips and insights!


Testing a page is easy: just open the testing tool, enter a URL, and review the output. If there are issues, the tool will highlight the invalid code in the page source. If you're working with others on this page, the share icon in the bottom right lets you share the results with them quickly. You can also use the preview button to view all the different rich results the page is eligible for. And … once you're happy with the result, use Submit To Google to fetch & index this page for search.

Want to get started with rich results? Check out our guides for marking up your content. Feel free to drop by our Webmaster Help forums should you have any questions or get stuck; the awesome experts there can often help resolve issues and give you tips in no time!


The cloaked keywords and link hack automatically creates many pages with nonsensical sentences, links, and images. These pages sometimes contain basic template elements from the original site, so at first glance, the pages might look like normal parts of the target site until you read the content. In this type of attack, hackers usually use cloaking techniques to hide the malicious content and make the injected page appear as part of the original site or a 404 error page.
  • Fixing the Gibberish Hack
    The gibberish hack automatically creates many pages with nonsensical sentences filled with keywords on the target site. Hackers do this so the hacked pages show up in Google Search. Then, when people try to visit these pages, they'll be redirected to an unrelated page, such as a porn site.
  • Fixing the Japanese Keywords Hack
    The Japanese keywords hack typically creates new pages with Japanese text on the target site, in randomly generated directory names. These pages are monetized using affiliate links to stores selling fake brand merchandise, and then shown in Google Search. Sometimes the hackers' accounts get added as site owners in Search Console.
  • Lastly, after you clean your site and fix the problem, make sure to file a reconsideration request to have our teams review your site.

    If you have any questions, post them on our Webmaster Help Forums!



    Understanding how your site was compromised is an important part of protecting it from attacks. Here are some of the top ways that sites get compromised by spammers.
    In any case, keeping all of your site's software up to date is essential to keeping hackers out of your website.

    As usual, if you have any questions, post on our Webmaster Help Forums for help from the friendly community, and see you next week!

    Why do sites get hacked? Hackers have different motives for compromising a website, and hack attacks can be very different, so they are not always easy to detect. Here are some tips to help you detect hacked sites!

    If you're still unable to find any signs of a hack, ask a security expert or post on our Webmaster Help Forums for a second look.

    The #NoHacked campaign will run for the next 3 weeks. Follow us on our G+ and Twitter channels, or look out for content on this blog, as we will be posting a summary for each week right here at the beginning of the following week! Stay safe in the meantime!


  • Configure 301 redirects on the old mobile URLs to point to the responsive versions (the new pages). These redirects need to be set up on a per-URL basis, individually mapping each mobile URL to its corresponding responsive URL.
  • Remove any mobile-URL specific configuration your site might have, such as conditional redirects or a vary HTTP header.
  • As a good practice, set up rel=canonical on the responsive URLs pointing to themselves (self-referential canonicals).
  • If you're currently using dynamic serving and want to move to responsive design, you don't need to add or change any redirects.
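    The per-URL redirect step above can be sketched as a simple mapping function. This is a minimal illustration, not a definitive implementation; the hostnames `m.example.com` and `www.example.com` are hypothetical placeholders for your own mobile and responsive hosts.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical hostnames for illustration: the old mobile site lives on
# m.example.com and the new responsive site on www.example.com.
MOBILE_HOST = "m.example.com"
RESPONSIVE_HOST = "www.example.com"

def redirect_for(url: str):
    """Return a (status, location) pair implementing a per-URL 301
    redirect from a mobile URL to its responsive equivalent."""
    parts = urlsplit(url)
    if parts.netloc != MOBILE_HOST:
        return None  # not a mobile URL; serve normally
    # Redirect each mobile URL individually, preserving path and query.
    target = urlunsplit((parts.scheme, RESPONSIVE_HOST, parts.path,
                         parts.query, parts.fragment))
    return (301, target)

print(redirect_for("https://m.example.com/products/1?ref=nav"))
```

    In practice you would implement the equivalent rule in your web server or framework, but the mapping logic is the same: change only the hostname, keep the path and query string intact, and return a 301.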

    Some benefits of moving to responsive web design

    Moving to a responsive site should make maintenance and reporting much easier for you down the road. Aside from no longer needing to manage separate URLs for all pages, it will also make it much easier to adopt practices and technologies such as hreflang for internationalization, AMP for speed, structured data for advanced search features and more.

    As always, if you need more help you can ask a question in our webmaster forum.


    Join us in welcoming the latest additions to the Webmasters community:

    नमस्ते Webmasters in Hindi!

    Добро Пожаловать Webmasters in Russian!

    Hoşgeldiniz Webmasters in Turkish!

    สวัสดีค่ะ Webmasters in Thai!

    xin chào Webmasters in Vietnamese!

    We will be sharing webmaster-related updates in our current and new blogs to make sure you have a place to follow the latest launches, updates and changes in Search in your languages! We will share links to relevant Help resources, educational content and events as they become available.

    Just a reminder, here are some of the resources that we have available in multiple languages:

    Testing tools:

    Some other valuable resources (English-only):

    If you have webmaster-specific questions, check our event calendar for the next hangout session or live event! Alternatively, you can post your questions to one of the local help forums, where our talented Product Experts from the TC program will try to answer them. Our Experts are product enthusiasts who have earned the distinction of "Top Contributor" or "Rising Star" by sharing their knowledge on the Google Help Forums.

    If you have suggestions, please let us know in the comments below. We look forward to working with you in your language!


    In the next few weeks, we're releasing two exciting BETA features from the new Search Console to a small set of users — Index Coverage report and AMP fixing flow.

    The new Index Coverage report shows the count of indexed pages, information about why some pages could not be indexed, along with example pages and tips on how to fix indexing issues. It also enables a simple sitemap submission flow, and the capability to filter all Index Coverage data to any of the submitted sitemaps.
    Here’s a peek of our new Index Coverage report:

    The new AMP fixing flow

    The new AMP fixing experience starts with the AMP Issues report. This report shows the current AMP issues affecting your site, grouped by the underlying error. Drill down into an issue to get more details, including sample affected pages. After you fix the underlying issue, click a button to verify your fix, and have Google recrawl the pages affected by that issue. Google will notify you of the progress of the recrawl, and will update the report as your fixes are validated.
    As we start to experiment with these new features, some users will be introduced to the new design over the coming weeks.


    If you have images on your site, you can help users identify the type of content associated with the image by using appropriate structured data on your pages. This helps users find relevant content quickly, and sends better targeted traffic to your site.
    If you're publishing recipes, add Recipe markup on your pages; for products, add Product markup; and for videos, add Video markup. Our algorithms will automatically badge GIFs, without the need for any markup. While we can't guarantee that badges will always be shown, adding the recommended structured data fields in addition to the required fields may increase the chance of a badge being added to your image search results.
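    As an illustration, Recipe markup is typically embedded as a JSON-LD block in the page. The sketch below builds one in Python; all recipe details and URLs are invented for the example.

```python
import json

# Minimal Recipe structured data; all field values are hypothetical.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Pancakes",
    "image": "https://example.com/photos/pancakes.jpg",
    "description": "Fluffy pancakes from pantry staples.",
    "recipeIngredient": ["1 cup flour", "1 egg", "1 cup milk"],
}

# Serialize into the script tag the page would carry in its head or body.
snippet = ('<script type="application/ld+json">'
           + json.dumps(recipe)
           + "</script>")
print(snippet)
```

    The same pattern applies to Product and Video markup: build the object with the required and recommended fields for that type, then embed it in a script tag of type application/ld+json.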
    You can use the Structured Data Testing Tool to verify that your pages are free of errors, and therefore eligible for the new Image Search badges. In addition, the Rich Cards report in Search Console can provide aggregate stats on your markup.
    If you have questions about the feature, please ask us in the Webmaster Help Forum.

    For employers or site owners with job content, this feature brings many benefits:

    Get your job listings on Google

    Implementation involves two steps:
    1. Mark up your job listings with Job Posting structured data.
    2. Submit a sitemap (or an RSS or Atom feed) with a <lastmod> date for each listing.
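    As a hedged sketch of both steps, the listing markup and a sitemap entry might look like the following; all values (job title, organization, URL, dates) are placeholders.

```python
import json

# Step 1: JobPosting structured data for one listing (values hypothetical).
job = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Software Engineer",
    "description": "<p>Build and maintain web services.</p>",
    "datePosted": "2017-06-20",
    "hiringOrganization": {"@type": "Organization", "name": "Example Corp"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "Zurich"},
    },
}

# Step 2: a sitemap entry carrying <lastmod> for the listing's URL.
sitemap_entry = (
    "<url>"
    "<loc>https://example.com/jobs/123</loc>"
    "<lastmod>2017-06-20</lastmod>"
    "</url>"
)
print(json.dumps(job))
print(sitemap_entry)
```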

    If you have more than 100,000 job postings or more than 10,000 changes per day, you can express interest to use the High Change Rate feature.
    If you already publish your job openings on other sites like LinkedIn, Monster, DirectEmployers, CareerBuilder, Glassdoor, and Facebook, those listings are eligible to appear in the feature as well.
    Job search is an enriched search experience. We've created a dedicated guide to help you understand how Google ranking works for enriched search and best practices for improving your presence.

    Keep track of how you’re doing and fix issues

    There’s a suite of tools to help you with the implementation:

    In the coming weeks, we’ll add new job listings filters in the Search Analytics report in Search Console, so you can track clicks and impressions for your listings.
    As always, if you have questions, ask in the forums or find us on Twitter!


    Search result snippets are much the same; they help people decide whether or not it makes sense to invest the time reading the page the snippet belongs to.  
    The more descriptive and relevant a search result snippet is, the more likely that people will click through and be satisfied with the page they land on. Historically, snippets came from 3 places:
    1. The content of the page
    2. The meta description
    3. DMOZ listings
    The content of the page is an obvious choice for result snippets, and the content that can be extracted is often the most relevant to people's queries. However, there are times when the content itself isn't the best source for a snippet. For instance, when someone searches for a publishing company for their book, the relevant homepages in the result set may contain only a few images describing the businesses and a logo, and maybe some links, none of which are particularly useful for a snippet.
    The logical fallback when the content of a page doesn't have much textual content for a search result snippet is the meta description. These should be short blurbs that describe the content accurately and precisely in a few words.
    Finally, when a page doesn't have much textual content for snippet generation and the meta description is missing, unrelated to the page, or low quality, our fallback was DMOZ, also known as the Open Directory Project. For over 10 years, we relied on DMOZ for snippets because its listings were often of much higher quality than the meta descriptions provided by webmasters, or more descriptive than what the page itself provided.
    With DMOZ now closed, we've stopped using its listings for snippeting, so it's a lot more important that webmasters provide good meta descriptions, if adding more content to the page is not an option.
    What makes a good meta description?
    Good meta descriptions are short blurbs that accurately describe the content of the page. They are like a pitch that convinces the user the page is exactly what they're looking for. For more tips, we have a handy help center article on the topic. Remember to make sure that both your desktop and your mobile pages include both a title and a meta description.
    What are the most common problems with meta descriptions?
    Because meta descriptions are usually visible only to search engines and other software, webmasters sometimes forget about them, leaving them completely empty. It's also common, for the same reason, that the same meta description is used across multiple (and sometimes many) pages. On the flip side, it's also relatively common that the description is completely off-topic, low quality, or outright spammy. These issues tarnish our users' search experience, so we prefer to ignore such meta descriptions.
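    One way to catch empty or duplicated descriptions is a quick audit script. The sketch below uses only the Python standard library; the page sources are hypothetical stand-ins for pages you would fetch from your own site.

```python
from html.parser import HTMLParser
from collections import Counter

class MetaDescriptionParser(HTMLParser):
    """Collect the content of a page's <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name") == "description":
            self.description = d.get("content", "")

def extract_description(html: str):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description

# Hypothetical pages; in practice you would fetch your own URLs.
pages = {
    "/a": '<head><meta name="description" content="Widgets for sale"></head>',
    "/b": '<head><meta name="description" content="Widgets for sale"></head>',
    "/c": "<head><title>No description here</title></head>",
}

descriptions = {url: extract_description(src) for url, src in pages.items()}
missing = [url for url, d in descriptions.items() if not d]
dupes = [d for d, n in Counter(d for d in descriptions.values() if d).items()
         if n > 1]
print(missing, dupes)
```

    Running this flags /c as missing a description and "Widgets for sale" as duplicated across /a and /b, the two problems described above.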
    Is there a character limit for meta descriptions?
    There's no limit on how long a meta description can be, but the search result snippets are truncated as needed, typically to fit the device width.
    What will happen with the "NOODP" robots directive?
    With DMOZ (ODP) closed, we stopped relying on its data, and thus the NOODP directive is already a no-op.
    Can I prevent Google from using the page contents as a snippet?
    You can prevent Google from generating snippets altogether by specifying the "nosnippet" robots directive. There's no way to prevent Google from using the page contents as a snippet while allowing other sources.
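    For example, the directive is set with a robots meta tag in the page's head:

```html
<!-- Tells Google not to show any snippet for this page -->
<meta name="robots" content="nosnippet">
```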

    As always, if you have questions, ask in the forums or find us on Twitter!
    Posted by Gary, Search Team


    Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google's guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site. Below are factors that, when taken to an extreme, can indicate when an article is in violation of these guidelines:


    When Google detects that a website is publishing articles that contain spammy links, this may change Google's perception of the quality of the site and could affect its ranking. Sites accepting and publishing such articles should carefully vet them, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel=”nofollow” on them?
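    For example, a link of questionable intent in a contributed article can be qualified like this (the URL is a placeholder):

```html
<a href="https://example.com/authors-site" rel="nofollow">the author's site</a>
```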


    Google takes action on websites that create articles made for links because this behavior is bad for the Web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users. Also, webmasters generally prefer not to receive aggressive or repeated "Post my article!" requests, and we encourage such cases to be reported to our spam report form. And lastly, if a link is a form of endorsement, and you're the one creating most of the endorsements for your own site, is this putting forth the best impression of your site? Our best advice in relation to link building is to focus on improving your site's content and everything, including links, will follow (no pun intended).

    Posted by the Google Webspam Team