Google has a broad range of resources to help you better understand your website and improve its performance. This Webmaster Central Blog, the Help Center, the Webmaster forum, and the recently released Search Engine Optimization (SEO) Starter Guide are just a few.
Testing a page is easy: just open the testing tool, enter a URL, and review the output. If there are issues, the tool will highlight the invalid code in the page source. If you're working with others on this page, the share icon on the bottom right lets you share your results with them quickly. You can also use the preview button to view all the different rich results the page is eligible for. And once you're happy with the result, use Submit To Google to fetch & index the page for search.
Want to get started with rich results? Check out our guides for marking up your content. Feel free to drop by our Webmaster Help forums should you have any questions or get stuck; the awesome experts there can often help resolve issues and give you tips in no time!
The cloaked keywords and link hack automatically creates many pages with
nonsensical sentences, links, and images. These pages sometimes contain basic
template elements from the original site, so at first glance, the pages might
look like normal parts of the target site until you read the content. In this
type of attack, hackers usually use cloaking techniques to hide the malicious
content and make the injected page appear as part of the original site or a 404
error page.
The gibberish hack automatically creates many pages with nonsensical sentences
filled with keywords on the target site. Hackers do this so the hacked pages
show up in Google Search. Then, when people try to visit these pages, they'll be
redirected to an unrelated page, such as a porn site.
The Japanese keywords hack typically creates new pages with Japanese text on
the target site in randomly generated directory names. These pages are monetized
using affiliate links to stores selling fake brand merchandise and then shown in
Google Search. Sometimes the hackers' accounts even get added as site owners in
Search Console.
Lastly, after you clean your site and fix the problem, make sure to file a
reconsideration request to have our teams review your site.
Be mindful of your sources! Be very careful with free "premium"
themes and plugins!
You have probably heard about free premium plugins! If you've ever stumbled upon a site offering, for free, plugins you normally have to purchase, be very careful. Many hackers lure you in by copying a popular plugin and adding backdoors or malware that allows them to access your site. Read more about a similar case on the Sucuri blog. Additionally, even legitimate, good-quality plugins and themes can become dangerous if:
you do not update them as soon as a new version becomes available
the developer of the theme or plugin stops updating it, and it becomes outdated over time
In any case, keeping all your site's software up to date is essential to
keeping hackers out of your website.
Botnets in WordPress
A botnet
is a cluster of machines, devices, or websites under the control of a third
party, often used to commit malicious acts such as running spam campaigns,
click fraud, or DDoS attacks. It's difficult to detect whether your site has been infected by a
botnet because there are often no specific changes to your site. However, your
site's reputation, resources, and data are at risk if your site is in a botnet.
Learn more about botnets, how to detect them, and how they can affect your site
in our article on botnets in WordPress and Joomla.
As usual, if you have any questions, post on our Webmaster
Help Forums for help from the friendly community. See you next week!
Why do sites get hacked? Hackers have
different motives for compromising a website, and hack attacks can be very
different, so they are not always easily detected. Here are some tips to
help you detect hacked sites!
Getting started:
Start with our guide "How do I know if my
site is hacked?" if you've received a security alert from Google or another
party. This
guide will walk you through the basic steps to check for any signs of
compromise on your site.
Understand the alert on Google Search:
At Google, we have
different processes to deal with hacking scenarios. Scanning tools will often
detect malware, but they can miss some spamming hacks. A clean verdict from Safe
Browsing does not mean that you haven't been hacked to distribute spam.
If you ever see "This site may be
hacked", your site may have been hacked to display spam. Essentially, your
site has been hijacked to serve some free advertising.
If you see
"This site may harm your computer" beneath the site URL then we think the
site you're about to visit might allow programs to install malicious software on
your computer.
If you see a big red screen before your site, that can mean a variety of things:
If you see "The site ahead contains malware", Google has detected that
your site distributes malware.
If you see "The site ahead contains harmful programs", then the site has
been flagged for distributing unwanted
software.
"Deceptive site ahead" warnings indicate that your site may be serving phishing or
social engineering. Your site could have been hacked to do any of these
things.
Malvertising vs Hack:
Malvertising happens when your site
loads a bad ad. It may make it seem as though your site has been hacked, perhaps
by redirecting your visitors, but in fact is just an ad behaving badly.
Open redirects: check if your site is enabling open redirects
Hackers might want to take advantage of a good site to mask their
URLs. One way they do this is by using open redirects, which allow them to use
your site to redirect users to any URL of their choice. You can read
more here!
Mobile check: make sure to view your site from a mobile browser in
incognito mode. Check for bad mobile ad networks.
Sometimes bad
content like ads or other third-party elements unknowingly
redirect mobile users. This behavior can easily escape detection because
it's only visible from certain browsers. Be sure to check that the mobile and
desktop versions of your site show the same content.
Use Search Console and check your messages:
Search Console is a
tool that Google uses to communicate with you about your website. It also
includes many other tools that can help you improve and manage your website.
Make sure you have your site verified in
Search Console even if you aren't the primary developer on your site. The
alerts and messages in Search Console will let you know if Google has detected
any critical errors on your site.
If you're still unable to find any signs of a hack, ask a security expert or
post on our
Webmaster Help Forums for a second look.
The #NoHacked campaign will run for the next 3 weeks. Follow us on our G+ and Twitter channels, or look out for the
content on this blog: we will post a summary of each week right here at
the beginning of the following week! Stay safe in the meantime!
Configure 301 redirects on the old mobile URLs to point to the responsive
versions (the new pages). These redirects need to be done on a per-URL basis,
individually, from each mobile URL to the corresponding responsive URL.
Remove any mobile-URL specific configuration your site might have, such as
conditional redirects or a Vary HTTP header.
As a good practice, set up
rel=canonical on the responsive URLs pointing to themselves
(self-referential canonicals).
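As an illustrative sketch (the URL is hypothetical), a self-referential canonical on a responsive page looks like this:

```html
<!-- On the responsive page at https://example.com/page (hypothetical URL):
     the canonical points at the page itself, while the old mobile URL
     301-redirects to this same URL. -->
<link rel="canonical" href="https://example.com/page">
```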
If you're currently using dynamic serving and want to move to responsive design,
you don't need to add or change any redirects.
Some benefits of moving to responsive web design
Moving to a responsive site should make maintenance and reporting much easier
for you down the road. Aside from no longer needing to manage separate URLs for
all pages, it will also make it much easier to adopt practices and technologies
such as hreflang for internationalization, AMP for speed, structured data for
advanced search features and more.
As always, if you need more help you can ask a question in our webmaster forum.
We will be sharing webmaster-related updates in our current and new blogs to
make sure you have a place to follow the latest launches, updates and changes in
Search in your languages! We will share links to relevant Help resources,
educational content and events as they become available.
Just a reminder, here are some of the resources that we have available in
multiple languages:
Developer documentation on
Search - a great resource where you can find feature guides, code labs,
videos and links to more useful tools for webmasters.
If you have webmaster-specific questions, check our event calendar for the
next hangout session or live event! Alternatively, you can post your questions
to one of the local help forums, where our talented Product Experts from the TC program will try to answer
your questions. Our Experts are product enthusiasts who have earned the
distinction of "Top Contributor," or "Rising Star," by sharing their knowledge
on the Google Help Forums.
If you have suggestions, please let us know in the comments below. We look
forward to working with you in your language!
In the next few weeks, we're releasing two exciting BETA features from the new Search Console to a small set of users — Index Coverage report and AMP fixing flow.
The new Index Coverage report shows the count of indexed pages, information about why some pages could not be indexed, along with example pages and tips on how to fix indexing issues. It also enables a simple sitemap submission flow, and the capability to filter all Index Coverage data to any of the submitted sitemaps.
Here’s a peek at our new Index Coverage report:
The new AMP fixing flow
The new AMP fixing experience starts with the AMP Issues report. This report shows the current AMP issues affecting your site, grouped by the underlying error. Drill down into an issue to get more details, including sample affected pages. After you fix the underlying issue, click a button to verify your fix, and have Google recrawl the pages affected by that issue. Google will notify you of the progress of the recrawl, and will update the report as your fixes are validated.
As we start to experiment with these new features, some users will be introduced to the new redesign over the coming weeks. Posted by John Mueller and the Search Console Team
If you have images on your site, you can help users identify the type of content associated with the image by using appropriate structured data on your pages. This helps users find relevant content quickly, and sends better targeted traffic to your site. If you're publishing recipes, add Recipe markup on your page, for products, add Product markup, and for videos, add Video markup. Our algorithms will automatically badge GIFs, without the need of any markup. While we can't guarantee that badges will always be shown, adding the recommended structured data fields in addition to the required fields may increase the chance of adding a badge to your image search results. You can use the Structured Data Testing Tool to verify that your pages are free of errors, and therefore eligible for the new Image Search badges. In addition, the Rich Cards report in Search Console can provide aggregate stats on your markup. If you have questions about the feature, please ask us in the Webmaster Help Forum. Posted by Assaf Broitman, Image Search team
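As a minimal sketch (the property values are hypothetical; see the structured data documentation for the required and recommended fields), Recipe markup in JSON-LD might look like this:

```html
<!-- Hypothetical example: minimal Recipe markup embedded in the page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Party Coffee Cake",
  "image": "https://example.com/photos/coffee-cake.jpg",
  "author": { "@type": "Person", "name": "Mary Stone" }
}
</script>
```

Product and Video markup follow the same pattern with their own types and fields.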
For employers or site owners with job content, this feature brings many benefits:
Prominent place in Search results: your postings are eligible to be displayed in the new job search feature on Google, featuring your logo, reviews, ratings, and job details.
More motivated applicants: job seekers can filter by various criteria like location or job title, meaning you’re more likely to get applicants who are looking for exactly that job.
Increased chances of discovery and conversion: job seekers will have a new avenue to interact with your postings and click through to your site.
Submit a sitemap (or an RSS or Atom feed) with a <lastmod> date for each listing.
If you have more than 100,000 job postings or more than 10,000 changes per day, you can express interest to use the High Change Rate feature.
If you already publish your job openings on another site like LinkedIn, Monster, DirectEmployers, CareerBuilder, Glassdoor, or Facebook, those postings are eligible to appear in the feature as well.
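A sitemap entry with a <lastmod> date for one job listing might look like this minimal sketch (the URL and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical job posting URL; <lastmod> tells us when the listing
         was last changed, so it can be recrawled promptly. -->
    <loc>https://example.com/jobs/software-engineer-1234</loc>
    <lastmod>2017-06-26</lastmod>
  </url>
</urlset>
```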
Job search is an enriched search experience. We’ve created a dedicated guide to help you understand how Google ranking works for enriched search, and practices for improving your presence.
Keep track of how you’re doing and fix issues
There’s a suite of tools to help you with the implementation:
In the coming weeks, we’ll add new job listings filters in the Search Analytics report in Search Console, so you can track clicks and impressions for your listings.
As always, if you have questions, ask in the forums or find us on Twitter!
Search result snippets are much the same; they help people decide whether or not it makes sense to invest the time reading the page the snippet belongs to.
The more descriptive and relevant a search result snippet is, the more likely that people will click through and be satisfied with the page they land on. Historically, snippets came from 3 places:
The content of the page
The meta description
DMOZ listings
The content of the page is an obvious choice for result snippets, and the content that can be extracted is often the most relevant to people’s queries. However, there are times when the content itself isn't the best source for a snippet. For instance, when someone searches for a publishing company for their book, the relevant homepages in the result set may contain only a few images describing the businesses and a logo, and maybe some links, none of which are particularly useful for a snippet.
The logical fallback, in cases when the content of a page doesn't have much textual content for a search result snippet, is the meta description. Meta descriptions should be short blurbs that accurately and precisely describe the content of the page in a few words.
Finally, when a page doesn't have much textual content for snippet generation and the meta description is missing, unrelated to the page, or low quality, our fallback was DMOZ, also known as The Open Directory Project. For over 10 years, we relied on DMOZ for snippets because its snippets were often much higher quality than those provided by webmasters in their meta descriptions, or were more descriptive than what the page provided.
With DMOZ now closed, we've stopped using its listings for snippeting, so it's a lot more important that webmasters provide good meta descriptions, if adding more content to the page is not an option.
What makes a good meta description?
Good meta descriptions are short blurbs that accurately describe the content of the page. They are like a pitch that convinces the user that the page is exactly what they're looking for. For more tips, we have a handy help center article on the topic. Remember to make sure that both your desktop and your mobile pages include both a title and a meta description.
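For example, a page-specific title and meta description would sit in the page's head on both the desktop and mobile versions (the wording here is hypothetical):

```html
<head>
  <title>Fresh-Roasted Coffee Beans | Example Roastery</title>
  <!-- A short, accurate, page-specific blurb; eligible to be shown
       as the search result snippet. -->
  <meta name="description"
        content="Small-batch coffee roastery in Springfield offering single-origin beans, weekly subscriptions, and free local delivery.">
</head>
```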
What are the most common problems with meta descriptions?
Because meta descriptions are usually visible only to search engines and other software, webmasters sometimes forget about them, leaving them completely empty. It's also common, for the same reason, that the same meta description is used across multiple (and sometimes many) pages. On the flip side, it's also relatively common that the description is completely off-topic, low quality, or outright spammy. These issues tarnish our users' search experience, so we prefer to ignore such meta descriptions.
Is there a character limit for meta descriptions?
There's no limit on how long a meta description can be, but the search result snippets are truncated as needed, typically to fit the device width.
What happens with the NOODP directive?
With DMOZ (ODP) closed, we stopped relying on its data, and the NOODP directive is now a no-op.
Can I prevent Google from using the page contents as snippet?
You can prevent Google from generating snippets altogether by specifying the "nosnippet" robots directive. There's no way to prevent Google from using the page contents as the snippet while allowing other sources.
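The nosnippet directive is set with a robots meta tag on the page:

```html
<!-- Prevents Google from showing any snippet for this page in search results. -->
<meta name="robots" content="nosnippet">
```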
As always, if you have questions, ask in the forums or find us on Twitter!
Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google's guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site. Below are factors that, when taken to an extreme, can indicate when an article is in violation of these guidelines:
Stuffing keyword-rich links to your site in your articles
Having the articles published across many different sites; alternatively, having a large number of articles on a few large sites
Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on
Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site (in which case use of rel=”canonical”, in addition to rel=”nofollow”, is advised)
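The last point above can be sketched in markup (the URLs are hypothetical):

```html
<!-- A link of questionable intent in a contributed article: rel="nofollow"
     tells search engines it is not an editorial endorsement. -->
<a href="https://example.com/" rel="nofollow">best cheap widgets online</a>

<!-- If the article duplicates the full content of an article on your own
     site, point the copy's canonical at the original. -->
<link rel="canonical" href="https://example.com/original-article">
```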
When Google detects that a website is publishing articles that contain spammy links, this may change Google's perception of the quality of the site and could affect its ranking. Sites accepting and publishing such articles should carefully vet them, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel=”nofollow” on them?
For websites creating articles made for links, Google takes action on this behavior because it’s bad for the Web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users. Also, webmasters generally prefer not to receive aggressive or repeated "Post my article!" requests, and we encourage such cases to be reported to our spam report form. And lastly, if a link is a form of endorsement, and you’re the one creating most of the endorsements for your own site, is this putting forth the best impression of your site? Our best advice in relation to link building is to focus on improving your site’s content and everything--including links--will follow (no pun intended).