
Google Webmaster Tools – why is it worth using them?

01 October 2012

Google is constantly improving its tools, which gain more and more genuinely useful functions. This is also the case with Google Webmaster Tools (GWT for short), which many webmasters still underestimate.

GWT should be every webmaster's main source of information about the status of their websites. Recent changes to the Google search algorithm force us to constantly monitor every aspect of our websites, including technical errors, unnatural links, and duplication of content within the site.

How to effectively use Google Webmaster Tools to analyze your website?

First of all, if you haven't added your domains to the tool yet, I suggest you do so now: link!

To start working with GWT, you must first verify yourself as the domain owner. This can be done in several ways: through your Google Analytics account (if the domain is connected to it), by uploading an appropriate file to the server via FTP, or by adding a line of code to the page.
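For illustration, the code-snippet method is a meta tag placed in the <head> section of your home page. The tag name is the one GWT checks; the content value below is just a placeholder for the token GWT generates for your account.

```html
<!-- Goes in the <head> of the home page; the content value is
     a placeholder for the token GWT generates for your account -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE">
```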

If you have followed all the steps correctly, we can start!

1.    Messages

This is currently the only channel through which Google communicates problems that its robots encounter when analyzing our websites. In the era of the recent algorithm changes, Google sends webmasters more and more detailed messages: it informs us, for example, about unnatural links pointing to our website, indexing problems, or large swings in traffic that may signal problems with the site.

2.    Sitelinks

For this feature you will more often hear the term Sitelinks. The option lets links to subpages of our website appear under some Google search results. The solution is meant to help Internet users by saving their time and letting them quickly find the information they are looking for. More details about sitelinks can be found here.


Fun fact: for some time now, queries that trigger sitelinks have returned only 7 results on the first page of search results (see: Google SERP Dumps 5.5% of Organic First Page Listings).

3.    Crawl errors

In this section you will find information about pages that, for various reasons, could not be reached by search engine robots. The errors are divided into several types, but from an SEO point of view the most important are those marked "Not found".

These errors mean that when the Google robot visited a given subpage of your website, it received a 404 response. The cause may be an incorrect link or a page that no longer exists.


Below the chart you will find a detailed list of all the errors; you can also download the full list and save it in Excel for closer analysis. Additionally, clicking any link in the GWT list opens a pop-up with detailed information about that link.
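When working with such an export, it helps to re-check each URL before fixing it, because the report sometimes lags behind the live site. A minimal Python sketch, where the file name and the "URL" column are my assumptions about the export, not GWT's format:

```python
import csv
import urllib.request
from urllib.error import HTTPError, URLError

def check_status(url):
    """Return the current HTTP status code (or error reason) for a URL."""
    try:
        return urllib.request.urlopen(url, timeout=10).getcode()
    except HTTPError as e:
        return e.code      # e.g. 404 for pages that are really gone
    except URLError as e:
        return e.reason    # DNS failures, timeouts, etc.

# "crawl_errors.csv" and the "URL" column are assumptions --
# adjust them to match your actual export.
with open('crawl_errors.csv', newline='') as f:
    for row in csv.DictReader(f):
        print(row['URL'], check_status(row['URL']))
```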

 

Why should you correct 404 errors on your website?

Very often, pages that no longer exist (such as expired advertisements) have already attracted inbound links that can affect our website's ranking. In such cases it is good practice to 301-redirect those pages to other, thematically related pages or categories, as in the sketch below.
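On an Apache server, for example, such a redirect can be declared in the site's .htaccess file; the paths below are purely illustrative:

```apacheconf
# .htaccess: permanently redirect an expired ad page
# to a thematically related category (paths are illustrative)
Redirect 301 /ads/expired-offer-123.html /ads/dogs/
```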

4.    Content duplication

Google constantly strives to improve the quality of its search results, so it is very important today to make sure your website contains no duplicated content and no low-quality subpages that offer no value to Internet users. Indexing duplicate and low-quality subpages may drag down the rankings of the remaining ones, a consequence of the search-algorithm update known as Panda.

How to find pages with duplicate content?

The best place to start working on content duplication is "HTML Improvements" in the "Optimization" category. Here you will find information about duplicate meta titles and meta descriptions; review the details to identify addresses with potentially duplicated content. Besides metadata duplication, you will also find information about titles and descriptions that are too long or missing altogether.
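When the duplicates come from URL variants of the same content (sorting parameters, print versions, and so on), one common remedy is the canonical tag pointing at the preferred version. A minimal sketch with an illustrative URL:

```html
<!-- Placed in the <head> of every duplicate variant;
     the href is illustrative -->
<link rel="canonical" href="http://www.example.com/category/listing/">
```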

5.    Search terms

Another place worth visiting frequently for its SEO value is the "Search Terms" tab in the "Traffic" category.

It presents statistics for the keywords that led Internet users to your website: the average position of each phrase, its CTR, and the number of impressions and clicks.

By sorting keywords by average position, you can see which phrases sit on the first page of search results and which do not. Sometimes, for phrases close to the top 10, small on-page optimization changes or a few links are enough to significantly improve the ranking in the SERPs and thus attract more clicks.
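As a sketch of that workflow, the snippet below filters a CSV export of the tab for phrases ranking just outside the top 10 and sorts them by impressions; the file name and column names are assumptions, so match them to your actual export:

```python
import csv

# "queries.csv" and the column names are assumptions --
# rename them to match your GWT export.
with open('queries.csv', newline='') as f:
    rows = list(csv.DictReader(f))

# Phrases at positions 11-20 are the cheapest wins: small on-page
# changes or a few links can push them onto the first page.
near_misses = [r for r in rows
               if 11 <= float(r['Avg. position']) <= 20]
near_misses.sort(key=lambda r: int(r['Impressions']), reverse=True)

for r in near_misses[:20]:
    print(r['Query'], r['Avg. position'], r['Impressions'])
```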


I also personally recommend reviewing the statistics with changes taken into account (the "With changes" tab). This feature is underestimated by many webmasters, yet it lets you find really valuable keywords: the change analysis shows which keywords gained the most interest over the last 30 days. By adjusting the content of specific subpages, for example rewriting meta titles and descriptions to make them more attractive to Internet users, you can significantly increase CTR.

6.    Remove URLs

This is a very useful function that lets you manually remove a specific URL of your website from the search results. It often happens that Google indexes a subpage we do not want indexed; with this option we can remove it within moments. If you do not want the removed address to reappear in Google, block robots from accessing it via the robots.txt file or by adding the appropriate code to the page.
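A minimal robots.txt rule for the first option looks like this; the path is illustrative, and the file must sit at the domain root:

```
# robots.txt -- keeps compliant robots away from the removed
# subpage (path is illustrative)
User-agent: *
Disallow: /old-subpage/
```

Alternatively, leave the page crawlable and put <meta name="robots" content="noindex"> in its head; note that robots must be able to fetch the page to see that tag, so do not combine it with a robots.txt block for the same address.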

 

7.    Site Speed

Google also attaches great importance to how quickly our websites load, and it is an issue every webmaster should pay attention to. To get information about your site's loading speed, go to "Labs" and select "Site performance". Analyzing this data will tell you whether a given website needs optimization to shorten its loading time.

For anyone interested in improving page load speed, I suggest looking at the Page Speed add-on, which suggests changes, and the Apache module (mod_pagespeed), which automatically optimizes the page and its resources.
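To give a taste of the module, a minimal Apache configuration might look like the sketch below; the filter selection is illustrative and the module itself has to be installed first:

```apacheconf
# Minimal mod_pagespeed configuration (Apache);
# the filter list is illustrative
ModPagespeed on
ModPagespeedEnableFilters combine_css,extend_cache
```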


8.    Link profile

This brings us to one of the most popular topics of recent months, namely external links. After the introduction of the Penguin update to the search algorithm, which is meant to penalize websites with unnatural links, many webmasters started analyzing their link profiles. Here, too, Google Webmaster Tools proves very helpful.

In the "Traffic" category, then "Links to your website?" you will find some valuable data. Among others: a list of anchor texts of external links, a list of linking domains and subpages on your websites most often indicated by external links.

For a short analysis, I will use data from the classifieds website http://www.morusek.pl.


For a more detailed analysis, download the list of linking domains. In a few seconds you will receive a clear summary in Excel.
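If you prefer to stay outside Excel, a few lines of Python produce a similar per-domain summary from the raw export; the file name and column name are assumptions, and the export is taken to contain one linking URL per row:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# "links.csv" and the "URL" column are assumptions about the export.
counts = Counter()
with open('links.csv', newline='') as f:
    for row in csv.DictReader(f):
        counts[urlparse(row['URL']).netloc] += 1

# Top 20 linking domains by number of links
for domain, n in counts.most_common(20):
    print(f'{n:6d}  {domain}')
```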


For further analysis, I recommend two Excel add-ins that I use every day at Bluerank: SEO Tools for Excel and the SEOmoz Mozscape Add-In For Excel. These are excellent tools that will save you many hours of work.

In the example below, I used the second add-in, which you can download here. After a quick installation, simply open the file with the previously downloaded list of links, select the addresses you want to analyze, right-click, and choose Run Mozscape on Selection.


If you select a large number of addresses, you will have to wait a while: the Mozscape API allows 1 query per 10 seconds, so 500 addresses take 500 × 10 s ≈ 83 minutes, or almost an hour and a half.

After starting the analysis, a new tab will be created in the file called Mozscape, where the overall summary will be available.

In my case, I went a step further and analyzed the Domain Authority of the linking domains, comparing the quality of the link sources pointing to morusek.pl with those of an example website linked through link exchange systems (SWLs).

From the analysis, it can be concluded that the quality of the link sources pointing to morusek.pl is clearly better than that of the example website linked through SWLs. Our website's profile has fewer links from low-quality websites (Domain Authority 0-30) and more links from higher-quality websites (Domain Authority above 30). If, in the case of your websites, the vast majority of links come from pages with Domain Authority 0-30, it means you need to work on obtaining links from better sources.

As you can see, Google Webmaster Tools is a really good solution, offering many functions that can help you analyze websites and improve their quality. I hope I have at least partly convinced those who do not yet use GWT to change their approach to this tool.
