Google Webmaster Tools, or Webmaster Central, can be considered one of the more useful tools in the SEO toolkit that Google provides. Its features include showing how Google crawls, indexes and ranks your site, how people find your site in the search results, and the CTR of the clicks that follow from those result pages. You can also find out, although to a limited extent, who is linking to you and to which pages, and receive (email) notifications or alerts when your site contains malware, your WordPress installation needs an update, or other issues have been detected.
So why should you use Webmaster Tools? Some of the features mentioned above already indicate why it might be helpful to verify your site and look into it periodically. Let's go through the features that are available and see how they can help basically any webmaster or SEO.
Dashboard
The dashboard provides you with a brief overview of selected metrics. It shows a top list of the most important queries, the number of impressions and clicks, links to your website, crawl errors, keyword visibility, and information about your sitemaps.
Messages
This section displays any messages about site vulnerabilities, new verifications, changes to sitelinks, etc.
Sitemaps
In the site configuration section you will find a page dedicated to sitemaps, where you can submit them. Google Webmaster Tools will tell you whether the sitemap was available, when it was last downloaded, how many URLs were submitted and how many URLs were indexed.
Provided the information Google reports about submitted URLs versus indexed URLs is fairly accurate, this can be a very useful tool! If you want to know which parts of your website are having trouble getting indexed, you can use this data. To get better insight you should provide multiple sitemaps, each containing the URLs of a particular section of your website (e.g. per language, category, and/or product assortment). The approach may differ per website structure. The result should be a fairly good idea of which parts of your website are underperforming in terms of index rate.
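As a sketch of this per-section approach, the snippet below builds one small sitemap per site section using only the Python standard library; the section names and URLs are made-up examples, not anything Webmaster Tools provides.

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal <urlset> sitemap document for the given URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# One sitemap file per section (example section names and URLs):
sections = {
    "sitemap-products.xml": ["http://www.example.com/products/red-widget"],
    "sitemap-blog.xml": ["http://www.example.com/blog/hello-world"],
}
sitemaps = {name: build_sitemap(urls) for name, urls in sections.items()}
print(sitemaps["sitemap-blog.xml"])
```

Submitting each generated file separately then lets Webmaster Tools report submitted versus indexed counts per section.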
Crawler access
This section provides details on how your robots.txt was found and lets you check whether it works as expected. Here you can also specify URLs that you would like to see removed from the web index by entering them manually on the Remove URL tab.
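You can also sanity-check robots.txt rules locally before looking at the report; a minimal sketch with Python's standard urllib.robotparser, using example rules:

```python
from urllib.robotparser import RobotFileParser

# Example rules; substitute the contents of your own robots.txt.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check which URLs a crawler is allowed to fetch under these rules.
print(parser.can_fetch("Googlebot", "http://www.example.com/page.html"))      # True
print(parser.can_fetch("Googlebot", "http://www.example.com/private/x.html")) # False
```

This only tests the rules themselves; Webmaster Tools additionally tells you whether Google could retrieve the file at all.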
Sitelinks
Sitelinks are links that appear underneath the regular URL for specific queries. They will start to appear for your website automatically, most likely on branded search queries, and they affect the CTR of your result in the search result pages.
Select the available sitelinks carefully and eliminate those that are of no value to prospective visitors (e.g. with wrongly set up pagination on your website, it is likely that sitelinks such as 'Page 1', 'Page 2', 'Page 5', etc. will show up).
Change of address
Firstly, do read the guidelines for moving a site to a new domain. Anyone experienced with moving websites to a new domain knows that a bad migration from site A to site B can become a real nightmare if, for example, redirects are not properly taken care of. Complete all steps in this section to make sure you're set.
Settings
A very useful section, for example to exclude parameters (on the parameter handling tab) that you do not wish to see indexed (e.g. sort options, filters, or any parameter that may cause duplicate content). In the main section you can set the geographic target of your website and your preferred domain (www or non-www). If necessary you can also tell Google to reduce the crawl rate for your website.
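To illustrate the duplicate content problem that parameter handling addresses, here is a sketch of URL normalisation with Python's standard urllib.parse; the parameter names (`sort`, `order`, `sessionid`) are hypothetical examples, not a fixed list.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical example parameters that create duplicate content.
IGNORED_PARAMS = {"sort", "order", "sessionid"}

def canonicalize(url):
    """Drop ignored query parameters, keeping the rest in order."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonicalize("http://example.com/shoes?sort=price&color=red"))
# http://example.com/shoes?color=red
```

Telling Webmaster Tools to ignore such parameters achieves the same collapsing of variants into one URL on Google's side.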
Search queries
This section shows the top queries and top pages for your website: the number of impressions and clicks, the calculated CTR and its change over time, and your ranking position in the search engine result pages for those keywords or URLs.
Links to your site
Shows the number of links to your website as reported by Google, the number of domains that link to your website, and the anchor texts used in those links.
Keywords
The keywords that were detected most often while crawling your website. This can give you an idea of whether the content of your website matches your theme. An interesting feature is that at the drilled-down level you can see which variants of a keyword Google has encountered (and thus detects as being similar).
Internal links
This section shows you the pages that are most linked to internally. The number of links pointing to internal pages is a way of telling search engines which pages are most important to you. This will probably show a number of site-wide links, but it can also give you an indication of possible duplicate content issues when you see a high number of internal links for specific pages.
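A rough local approximation of this report is simply counting internal link targets across your pages; a sketch with Python's standard html.parser, using two made-up page snippets:

```python
from collections import Counter
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count internal link targets (root-relative hrefs) across pages."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("/"):  # internal links only
                self.counts[href] += 1

# Illustrative page fragments, not real crawl data.
pages = [
    '<a href="/about">About</a> <a href="/products">Products</a>',
    '<a href="/products">Products</a> <a href="http://other.com/">Out</a>',
]
counter = LinkCounter()
for html in pages:
    counter.feed(html)
print(counter.counts.most_common(1))  # [('/products', 2)]
```

Pages that unexpectedly collect many internal links (e.g. paginated duplicates) stand out the same way they do in the report.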
Subscriber stats
Shows the number of subscribers to your RSS feed, when detected.
+1 Metrics
This is the first part of the newly introduced +1 Metrics section within Webmaster Tools. First of all it shows a prompt to add the +1 button to your website (which might be removed). It then shows the number of +1s for your website's URLs and the impact on CTR of having or not having the +1. Enough data must be available before these estimates are shown.
Shows where the +1 button was used over time, both on and off your site.
This report covers the usage of +1 on your website and, once enough data has been collected, some anonymized data about the number of unique users, their location, and their age and gender, when available.
NB: Google recently began proactively pitching AdSense advertisers to put +1 buttons on their websites, probably to get better demographics via their publisher network.
Malware
If any malware was detected on your site, it will be reported here.
Crawl errors
Reporting in this area tells you which URLs Google was unable to crawl and where they were linked from, whether on external pages or in your sitemap. In a separate tab it shows pages that were restricted by robots.txt, pages returning 404 Not Found errors, and unreachable pages (e.g. a 50x server error status code).
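The same bucketing can be sketched in a few lines of Python; the URLs and status codes below are hypothetical crawl results used only to show the classification.

```python
def classify(results):
    """Bucket crawled URLs by HTTP status, like the crawl errors tabs."""
    buckets = {"ok": [], "not found": [], "unreachable": []}
    for url, status in results.items():
        if status == 404:
            buckets["not found"].append(url)
        elif 500 <= status < 600:  # 50x server errors
            buckets["unreachable"].append(url)
        else:
            buckets["ok"].append(url)
    return buckets

# Hypothetical crawl results: URL -> HTTP status code.
results = {"/": 200, "/old-page": 404, "/checkout": 503}
print(classify(results))
```

In practice you would feed this from your own server logs to cross-check what Webmaster Tools reports.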
Crawl stats
Shows a graph of Googlebot activity over the past 90 days. The number of pages crawled per day and the time spent downloading a page are good indicators of how healthy your website is from a Googlebot perspective. If you manage to decrease the time spent downloading a page, you will see this reflected in an increase in the number of pages crawled per day. A good way to start is to optimize your database queries and enable some sort of caching mechanism on your website (more info in our post on State of Search: What is really important in Technical SEO).
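As a minimal sketch of such a caching mechanism, the snippet below memoizes an expensive page fragment with Python's standard functools.lru_cache; `render_fragment` is a hypothetical stand-in for a slow database-backed render, not part of any real framework.

```python
import functools

@functools.lru_cache(maxsize=256)
def render_fragment(category):
    """Stand-in for an expensive database query / template render."""
    return "<ul><li>%s item</li></ul>" % category

# The first call computes the fragment; repeat calls for the same
# category are served from the in-memory cache, cutting response time.
html = render_fragment("shoes")
assert render_fragment("shoes") is html  # cached object reused
```

Real setups would more likely use a shared cache (e.g. a reverse proxy or memcached), but the effect on per-page download time is the same idea.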
Fetch as Googlebot
A tool that shows you how Googlebot fetches your page: you can enter any page from your verified domain and see the HTML (the first 100 KB) and the HTTP response that Googlebot encountered during its crawl.
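You can approximate this locally by requesting a page with Googlebot's User-Agent string; a sketch with Python's standard urllib that only builds the request (sending it needs network access, and the URL is an example):

```python
import urllib.request

# Googlebot's published User-Agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

req = urllib.request.Request("http://www.example.com/",
                             headers={"User-Agent": GOOGLEBOT_UA})
# urllib.request.urlopen(req) would return the HTTP status, headers and
# HTML body, comparable to what Fetch as Googlebot reports.
print(req.get_header("User-agent"))
```

Note that if your server varies its response by User-Agent, this kind of check is exactly how such differences are detected.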
HTML suggestions
A particularly useful section, because it provides information about issues that were detected on your website, for example non-indexable content or duplicate title tags and meta descriptions. Correcting the issues listed here should help your site's performance, because the fixes add value for the user by better showing what each page is about in the search engine result pages.
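Duplicate titles are easy to audit yourself as well; a sketch that groups pages by their `<title>` text, over made-up page data:

```python
import re
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group page URLs by <title> text; keep titles used more than once."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        match = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
        if match:
            by_title[match.group(1).strip()].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}

# Illustrative page data: URL -> HTML.
pages = {
    "/red-shoes": "<title>Shoes</title>",
    "/blue-shoes": "<title>Shoes</title>",
    "/hats": "<title>Hats</title>",
}
print(find_duplicate_titles(pages))  # {'Shoes': ['/red-shoes', '/blue-shoes']}
```

Running this over your own crawl lets you verify fixes before Google re-crawls and updates the report.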
Instant Previews
A snapshot tool that shows you how Google sees your site. If the snapshot differs from what users see, there might be crawling issues on your website.
Caveat: Flash elements or User-Agent checks might cause some disruptions in the snapshot created in this overview. Be aware that serving different content based on User-Agent checks can be considered cloaking, so be sure to check Google's Quality Guidelines.
The items mentioned above provide a lot of information about your website through the eyes of Google. But which items should you really be paying attention to?
First of all, any technical issues with the website should be fixed as soon as possible. If technical issues are preventing Googlebot from fully spidering your website, you will have fewer pages benefiting you in the SERPs. So really pay attention to the technical issues that Google reports via Webmaster Tools.
Google is continuously putting effort into making 'a faster web', and your site should be fast as well, also from a visitor perspective: visitor happiness is, in Google's view, one of the most important KPIs of a successful website.
Study the site performance overview in WMT and the crawl stats overview. Decreasing the loading time of your website brings a direct benefit: Googlebot will only spend a certain amount of time on your website, consuming a number of pages during that time, so faster pages mean more pages crawled per day.
Use tools like YSlow, Google Page Speed or Webpagetest.org to see where your website can be improved.
If you want to make sure that all your pages rank well in the search engines, there is just one answer: you need link juice that reaches all of your pages, top to bottom and bottom to top. Use the Links to your site overview and also pay attention to how people are linking to your website. Make sure you have a diverse link profile, including varied anchor texts.
Click Through Rate
The click-through rate of your website is an indicator that tells Google whether the information displayed in the SERPs matches the intent of the user's search query.
See the Search queries overview and watch your performance over time once you start making changes to your website's titles or meta descriptions, or when certain keyphrases allow you to show rich snippet extensions (reviews, place or person information, etc.).
Social / +1
Pay attention to the +1 Metrics section, as it gives you a good impression of what your (target) audience really likes. Are your most +1'd pages your most popular pages? Do they get credit for being +1'd so much? Will other pages benefit if you link to these pages? What happens to the CTR of your site in the SERPs once people have +1'd your products or articles?
Preparing a schedule
Make sure you reserve plenty of time for making adjustments to your website. Technical issues should be fixed first; from experience as a consultant, these can sometimes take a very long time to get implemented. In the meantime you can work on the other issues that appeared in WMT. Make sure your content team can work on the duplicate titles and meta descriptions (if it's not a technical fix). Put all your project actions on a reasonable timeline and use daily five-minute stand-up meetings with your team (agile, anyone?) to discuss what will be done that day and what is to be finished by the next morning. Projects done? Re-iterate.
Do you know of methods that will help others leverage Google Webmaster Tools features, for example by using third-party tools to combine Google AdWords data, or your experiences with Webmaster Tools integration in Raventools? Do you have a feature request? We are also interested in hearing those, so please let us and all our readers know!
PS: Did you know Google Webmaster Tools now warns users to update their WordPress installation?