3rd July 2014

Google Webmaster Tools - Part 3: Getting Results

Mike Davis
Lead Developer

In part one of this series I gave an overview of Google Webmaster Tools.

In part two I looked in a bit more detail at some of the areas that I have found useful to understand when reviewing a site's listings in Google. In part three, I am going to look at how, having analysed this data, you can help improve your site's listing in Google.

As mentioned in part one, I am not an SEO expert, so this shouldn't replace consulting or using SEO experts, but it will hopefully help you cover the basics without having to spend a lot of money.

Search Traffic

This section wasn't covered in part two of this series, but it is hopefully fairly self-explanatory and will help you decide where to focus your SEO effort.

Search Queries: this shows the search terms that have driven the most traffic to your site. It can help you target key pages and think more carefully about the wording of your pages, to try and earn more natural listings in Google.

Crawl details

Having looked through the data in the crawl details sections, you should be able to identify the pages that are generating errors on your site and work out what needs to be done to fix them. The key errors to concentrate on are ‘Server errors’, ‘Soft 404’ errors and ‘Not found’ pages.
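If the list of reported errors is long, it can be worth re-checking the URLs yourself before deciding what to fix, as some may already have been resolved since Google last crawled them. Below is a minimal Python sketch of that idea; it assumes you have exported the affected URLs into a plain text file called crawl_errors.txt (that file name, and the use of the requests library, are my own choices rather than anything Webmaster Tools provides).

# check_urls.py - re-check URLs reported as crawl errors, so you can
# see which ones are still broken. Assumes the URLs have been exported
# (one per line) into 'crawl_errors.txt' - a hypothetical file name.
import requests

with open("crawl_errors.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # A HEAD request is enough to read the status code without
        # downloading the whole page.
        response = requests.head(url, allow_redirects=True, timeout=10)
        status = response.status_code
    except requests.RequestException as exc:
        status = f"request failed ({exc})"
    print(f"{status}\t{url}")

Anything still returning a 5xx status points to a server error, while 404s are genuine ‘Not found’ pages that probably want a redirect or a fixed link.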

Sitemaps

Looking at the sitemaps section can be slightly addictive as you watch the graph (hopefully) climb while Google indexes more and more of your content. Sometimes, though, you take a look (thinking that everything is working fine) and spot a problem with the number of pages being indexed.

A while ago I was working with a client whose sitemap, when we looked at it in Google Webmaster Tools, showed far fewer pages indexed than were being submitted. Their sitemap.xml file contained around 240,000 URLs, but only around 10,000 were actually indexed. The site was set up to send Google a new sitemap.xml every day. By changing this to a weekly submission, the number of regularly indexed URLs rose to around 225,000. It seems that Google wasn't able to index all the pages in a single day and kept starting again each day, so it never got to the end of the list. Google says it can't guarantee to index your whole site, but having over 90% of the pages indexed was a lot better than only 4%.
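For reference, a sitemap.xml is just a list of page URLs wrapped in <url> entries inside a <urlset>. The sketch below shows one way you might generate it in Python; get_site_urls() is a hypothetical placeholder for however your own site produces its list of pages, and the example.com URLs are purely illustrative.

# make_sitemap.py - a minimal sketch of generating a sitemap.xml from a
# list of page URLs, using only the Python standard library.
from datetime import date
from xml.etree import ElementTree as ET

# Standard sitemaps.org namespace used by sitemap.xml files.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def get_site_urls():
    # Placeholder: replace with a query against your own CMS or database.
    return ["http://www.example.com/", "http://www.example.com/about/"]

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = today
    return ET.ElementTree(urlset)

if __name__ == "__main__":
    build_sitemap(get_site_urls()).write(
        "sitemap.xml", encoding="utf-8", xml_declaration=True
    )

However you generate it, the point of the story above is that how often you submit the file matters as much as what is in it, so keep an eye on the indexed count after any change.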

That's all folks!

Although I haven't covered Google Webmaster Tools in great detail, hopefully you will now feel more confident looking through and using it. It is a great tool for analysing your sites and seeing how to get better performance from them, both in terms of user experience and in your search rankings.