Google Webmaster Tools is a free service from Google that was designed specifically for webmasters. It allows you to check the status of your websites on Google and optimise your websites for its search engine. Amongst other things, the service lets you submit sitemaps, define crawl rates, check incoming links, find broken links, and much more.
In this resource guide, I will show you how you can use Webmaster Tools to analyse your website and correct issues that might be reducing your search engine traffic.
The Webmaster Tools Home Page
You can log in to Webmaster Tools at www.google.com/webmasters/tools/ using your Google account. The home page shows the websites you have added to Webmaster Tools, sorted alphabetically or by site health. You can also display websites using a compact view.
At the top right hand side of the page you will see a help box that lists many useful articles and configuration options. Both features are available throughout the Webmaster Tools service and change according to the page you are viewing. For example, if you are browsing the Search Traffic section, you will see help articles relating to stats, data and Google Analytics.
In the configuration options box, there is an option to change your Webmaster Tools preferences. From here, you can choose to get email notifications every day about critical issues or all issues. This saves you from having to check Webmaster Tools frequently for any problems with your websites.
Once you are viewing one of your websites, the configuration box shows you several configuration options such as site settings and verification details.
Going back to the left hand side, we have links to all messages, labs and resources. When you are viewing one of your websites, the messages page will only display relevant messages for that website. On the home page, messages are shown for all websites you own. You will see a variety of messages, including new errors that have been found, integration with Google Analytics and notices about not found (404) errors.
Adding a Website to Webmaster Tools
Let’s backtrack a little and look at how you can add a website to Webmaster Tools. The “Add A Site” button is located at the top right hand side of the Webmaster Tools home page.
Clicking on the button will bring up a box for you to enter your website URL (minus the http:// part).
You then need to verify your website. The recommended method of doing this is to download the HTML file and upload it to the root of your domain.
Four alternative methods can be used to verify a website. I find adding a meta tag to my website’s header is one of the quickest methods. You can also sign in through your domain registrar or related Google account.
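If you opt for the meta tag method, you simply paste a tag into the head section of your home page. The tag looks something like the example below; the content value here is only a placeholder, as Google generates a unique token for each site.

```html
<head>
  <!-- Placeholder token: Google generates a unique value for your site -->
  <meta name="google-site-verification" content="your-unique-token-here" />
</head>
```

Once the tag is in place, click the verify button and Google will check your home page for it.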
If you have failed to verify your website, a message will be displayed on the home page temporarily.
Your website will still be listed on your Webmaster Tools home page, however you will not be able to access any information for your website until it has been verified.
Once you have verified your website, you can then start analysing it.
When you click on one of your websites from the Webmaster Tools home page, you will be taken to the Site Dashboard. This area gives you a summary of your website including important site messages, crawl errors, search queries and the status of the content in your sitemap that has been indexed.
The site dashboard does give a good summary of what is happening with your website, however it is not a replacement for other pages. In my opinion, it is always better to spend a few minutes checking your website with all of the tools available on Webmaster Tools.
The Search Appearance section details how your website looks in Google’s search results. In the Search Appearance menu, you will see an information icon to the right hand side.
Clicking on the information icon brings up an overview of all the terminology that Google uses for search appearance including snippets, breadcrumbs and sitelinks. I recommend familiarising yourself with all of these terms so that you understand what Google is referring to.
Structured Data refers to the structured information that Google has found on your website. This covers markup formats such as microdata, microformats and RDFa, typically using schema.org vocabularies to describe items like videos, events and reviews.
The tool will help you understand what type of information is located on your website and how Google has indexed it.
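To give you an idea of what Google is looking for, a page marked up with schema.org microdata might describe an event like this. This is a simplified, hypothetical sketch; the property names come from the schema.org Event type.

```html
<!-- Simplified schema.org Event markup using microdata -->
<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">WordPress Meetup</span>
  <time itemprop="startDate" datetime="2013-09-14">14 September 2013</time>
  <span itemprop="location">London</span>
</div>
```

Pages marked up in this way will appear in the Structured Data report once Google has crawled them.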
Data Highlighter allows you to highlight specific information on a page for Google. Nine data types are currently supported: Articles, Book Reviews, Events, Local Businesses, Movies, Products, Restaurants, Software Applications and TV Episodes.
Highlighting information is simple. The first thing you need to do is inform Google of the URL and the data type you want to highlight.
You then highlight the data on the page and define it, e.g. the title of an article or the date it was published.
The premise of Data Highlighter is that it gives you control over what information is displayed in Google search results. It is worthwhile if you want to highlight specific information in search engine results.
HTML Improvements advises whether there are any issues with your meta descriptions or title tags. For example, it will report meta descriptions that are too short or too long. This page also lists any non-indexable content located on your website.
Sitelinks are used by Google to help people navigate your website. You will see them located underneath website descriptions in Google search results.
Google will automatically generate sitelinks for your website, however you can demote any page that you do not want listed.
The Search Traffic area offers four tools for analysing traffic to your website.
There are two parts to the Search Queries section: Top queries and Top pages. Top queries lists the top keywords and phrases that are directing traffic to your website. It highlights clicks, impressions, click-through rate and the average position of the search result. The time period can be changed and you can filter data. All data can be downloaded too.
The other part is Top pages. It lists the pages on your website that are receiving the most traffic. It details impressions, clicks, click-through rate and average position. It also shows the change in each statistic. For example, it will highlight that a particular page has received 20% less traffic over a given time period. This makes it a fantastic resource for monitoring the performance of particular keywords and pages.
Links to Your Site
The Links to Your Site section details exactly who is linking to your website. The summary page lists who is linking the most and what pages have the most incoming links. It also shows the anchor text that is most commonly used to link to your content.
A complete list of who is linking to your website can be viewed online or downloaded as a CSV file. It details the domain name, number of incoming links and the number of linked pages. You can also download all data for the most linked pages and the most frequently used anchor text.
Internal Links highlights how you are linking to your content internally. The data can be downloaded as a CSV file.
Google will sometimes apply a penalty to a website if it believes it to be a source of spam. If your website has been penalised, it will be highlighted in the manual actions section with a description of why a penalty was applied.
The Google Index section offers three tools that focus on the status of your website in Google’s search results.
Index Status allows you to see the number of pages that you have indexed in Google. The graph visualises the growth of your website over time. The advanced tab offers additional information such as the number of pages blocked by robots.txt.
Content Keywords shows the most common keywords that Google finds when viewing your website. Keywords are listed by the number of occurrences found on your website.
Google will group related keywords together. For example, my website reported that my top keyword “blogging” has seven variants: blogging, bloggers, blog, blogs, blogger, blog’s, blogged. Google treats these keywords as equal when listing content keywords.
A robots.txt file should be used to advise Google how it should crawl your website. However, should you decide to remove a page from Google’s index, you can do so using the Remove URLs tool.
The Crawl section deals with how Google crawls your website content. There are six tools in total.
The Crawl Errors section will advise you of any serious problems with your website such as DNS issues, server connectivity and a missing robots.txt file. It also highlights crawl errors such as not found (404) pages and pages with restricted access.
This is an important tool to use as it will highlight any problems Google has with indexing your pages.
Crawl Stats highlights Googlebot activity on your website over the last 90 days. It shows the number of pages crawled per day, the kilobytes downloaded per day and the time spent downloading a page.
Fetch as Google
Fetch as Google lets you see how Google views a page. All you have to do is enter the URL that you want to fetch.
Google will then show you the code it sees when it crawls your page. It also notes how long in milliseconds it took to download the page.
Fetch as Google is a useful tool to use if Google isn’t crawling your website correctly. For example, Google sometimes does not crawl data that is being pulled from external services.
Blocked URLs gives a summary of the pages and directories that you have blocked Google from indexing through the robots.txt file.
At the bottom of the page is a tool to test whether the URLs you have blocked are being blocked correctly. You will find this useful if you do not have much experience with the robots.txt file and want to ensure that your code is functioning correctly.
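If you have never worked with robots.txt before, a typical file looks something like the example below. This is a hypothetical sketch; the paths are placeholders for directories you might want to keep out of the index.

```text
# Apply the rules to all crawlers
User-agent: *
# Block these directories from being crawled
Disallow: /wp-admin/
Disallow: /private/
# Point crawlers to your sitemap
Sitemap: http://www.example.com/sitemap.xml
```

You can paste rules like these into the testing tool to confirm that the URLs you care about are allowed or blocked as intended.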
The Sitemaps section allows you to submit your website sitemaps to Google and test them. The page details the number of pages and images that have been submitted and indexed. The status of individual sitemaps is displayed at the bottom of the page.
Google will note any problems there are with your sitemap. It is important to revisit this tool from time to time and ensure that Google is indexing all content correctly. From time to time, Google may give warnings because a page could not be accessed. I occasionally get these errors for blog posts, however resubmitting the sitemap in question usually resolves the issue.
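For reference, a sitemap is a simple XML file that follows the sitemaps.org protocol. A minimal example with a single entry looks like this; example.com and the values shown are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/sample-post/</loc>
    <lastmod>2013-09-14</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Most website platforms can generate a file like this automatically, so you rarely need to write one by hand.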
Certain types of websites display duplicate information. A common example of this is an online store that displays a description about a product on the home page, category listings and product page. URL Parameters lets you inform Google how you want to handle certain parameters so that it can index your content better.
Google lists all parameters that your website uses. You can also add additional parameters.
You can inform Google of parameters that change the content that a user is viewing.
You can define how each parameter affects the content of a page and then specify which URLs Google should crawl.
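To illustrate with hypothetical URLs, the three addresses below could all serve the same product page. Telling Google that the sort and sessionid parameters do not change the content (or should not be crawled) helps it index just one version.

```text
# All three URLs return the same product page content
http://www.example.com/product/blue-widget/
http://www.example.com/product/blue-widget/?sort=price
http://www.example.com/product/blue-widget/?sessionid=12345
```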
URL Parameters is an advanced tool that you may need to use if your website platform uses a lot of parameters that change content. If you own a blog or content-based website, it is not something you need to be concerned about.
Webmaster Tools offers a few additional resources at the bottom of the main menu.
The Malware page will inform you if Google has detected any malware on your website.
Google is famous for its experiments and Webmaster Tools is no different. The tools that are listed in the labs section change every so often. Currently there are three tools.
Google+ authorship has become an essential part of promoting yourself and your websites online. The Author Stats page shows where your profile is being displayed and clicked online. This is a great tool if you are a blogger as you can see where your profile is getting exposure.
Custom Search allows you to create a customised search engine for your website. The layout and style can be customised and you can connect the search engine to your Google AdSense account and earn money.
Instant Previews lets you compare one of your pages with Google’s Instant Preview feature.
Desktop and mobile previews are displayed in the same form they would display in search results.
As I mentioned previously, lab tools change often and are sometimes forgotten about as a result. You may be surprised to hear that Google’s Instant Preview feature was actually removed from Google’s search results six months ago; yet the option to see previews through Webmaster Tools still remains. The Site Performance tool that is still listed has been removed too.
I hope you have enjoyed this walkthrough of Google’s Webmaster Tools. As you have seen, it is not a difficult service to understand. By using Webmaster Tools, you can ensure that all of your websites are optimised correctly for Google. If you are not using the service, you are probably leaving a lot of traffic on the table.
If you enjoyed this article, I encourage you to subscribe to WP Hub so that you never miss any of our great articles.