If I work on my SEO, I want to know whether it produces results, so I might as well learn right away how to measure these results with the tools Google provides! Google Search Console is an indispensable tool that complements Google Analytics. So without further ado, let's get to it!

Update: Google has recently launched a new version of Google Search Console. And, great news, our guide is wonderfully up to date! Let’s check it out now… :)

How to create a Google Search Console account?

In Google Search Console, enter the URL of your site in the corresponding field and validate by clicking on "Add property".

There are several methods to validate the ownership of your site:

The 1st method, displayed by default by Google, goes through an HTML file (this method requires access to the website's source files).

To do so:

  1. Download the validation file that has a unique number. Save it in your documents or on your desktop, without changing the file name.
  2. Transfer the file to the root of your site.
  3. Verify that the transfer was successful (see the quick check after this list). If so, the uploaded validation file should display as one of your site's web pages in your browser. The URL of this page must be composed of your domain name, followed by the name of the validation file. For example: http://www.mon-site.com/googleff2a43992774ce75.html
  4. Return to the Google Search Console interface and validate.
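
Before validating, you can quickly confirm from outside that the file is reachable. Here is a minimal sketch in Python, assuming the requests library is installed and reusing the illustrative file name from step 3 (not a real token):

    # Check that the Search Console validation file is reachable at the site root
    import requests

    resp = requests.get("http://www.mon-site.com/googleff2a43992774ce75.html")
    print(resp.status_code)  # 200 means Google will be able to fetch it too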

Warning: Do not delete the validation file at the root of your domain at the risk of no longer being able to access the data in Google Search Console.

The 2nd method consists of modifying your DNS settings directly on your account with your domain provider (simpler method).

Select "Domain Name Provider" as below > at the bottom of the property validation form.

You'll be redirected to another page.

Choose your domain provider and follow the steps indicated by your provider.
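
Depending on your provider, this verification usually comes down to adding a TXT record to your domain's DNS. The token below is only a placeholder; Search Console generates the real value for you:

    Type:  TXT
    Host:  @   (the root of your domain)
    Value: google-site-verification=XXXXXXXXXXXXXXXX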

Attention: If you have several sites, you must repeat the property addition and verification steps for each one of them.

Don't forget to register both the secure (https) and non-secure (http) versions of your site, as well as your subdomains.

Example:

  • http://mon-site.com
  • https://mon-site.com
  • https://blog.mon-site.com
  • http://blog.mon-site.com

Linking Google Analytics & the Google Search Console

Open your Google Analytics account and head to “settings”.

Then click on "property settings" and scroll down to Google Search Console and click on "set up search console" → Add → Save

That's it, your Google Search Console account is now configured, you can start optimizing your site!

How to monitor SEO performance?

To do this, go to the "Performance" report, you will be able to track all your traffic, in terms of clicks, impressions, CTR (Click through rate) and average position.

  • The Clicks filter shows you how many clicks you got in the search results.
  • The Impressions filter shows you how many times users saw a link to your site in Google's results pages. (Google counts these impressions based on different factors, and it's not necessarily useful to know all of them.)
  • The CTR (click-through rate) is the percentage of clicks your links generate when they appear on Google's results pages (see the quick example after this list). Just click on "CTR" and you'll know everything!
  • The average position gives you your ranking on the keywords that interest you, as well as your overall average position. This is obviously another metric to watch: the closer your positions are to position 1 on page 1 (the dream!), the more traffic you will get.
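
As a quick reference, CTR = clicks ÷ impressions × 100. For example, a link displayed 1,000 times in the results and clicked 50 times has a CTR of 5%.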

Are you abnormally badly positioned on certain topics? Do your positions tend to go up and down? Just click on "Position", and you will know everything!

Below this graph, you will be able to filter the data to refine your searches according to:

  • Queries shows you the list of your best keywords
  • Pages shows you the list of your best pages
  • Countries shows you the list of countries from which searches originated
  • Devices shows you which devices were used for the search
  • Search appearance shows you your enhanced results (rich results)

Key takeaway: When you filter by pages, you'll get a list of the top pages displayed in the SERP. When you click on a page, it will automatically take into account all the data from that page.

When you then click on the "Queries" option, it will show you all the keywords that the page in question ranks for, ordered by importance.

This report also allows you to filter your data by date, up to 16 months back in the new version of Google Search Console.

Use the performance report to optimize SEO

Boost your CTR

In this scenario, we will focus on pages with a low CTR but which rank, that is to say which are positioned on the first page of Google.

These are keywords that can bring you traffic!

For this, we will focus on keywords positioned between 1 and 5 but with a CTR below 4.88%.

Because according to Advanced Web Ranking, a position 5 has a CTR of about 4.88%.

To do this, in "Queries", select by CTR by pressing the button as below then filter on "less than 5":

Make sure that "Position" is still selected then select CTR and filter on "less than 4.88" (use a “dot” not a comma).

This will give you a list of keywords with potential!

Choose the keywords that have high impressions but low CTR.

Click on one of these keywords to find the corresponding page.

Then take a look at the keywords that rank on this page.

To do this, click on "+ new" in the performance report and select "Pages" to enter the URL of the page.

You will surely see that the page has a high number of impressions but a low CTR, which confirms your suspicions!

Now that you have located the page, edit your title and meta description to increase your CTR.

After making your changes, monitor their impact using the date comparison feature in the report.

Find keywords

Let’s stay in the “Performance” report. Google Search Console is also a good way to improve your list of new keywords to optimize.

By using the performance report and in a few steps you will be able to find gold nuggets!

For this, nothing could be simpler:

  • Filter dates to 28 days.

  • Adjust the query filter to show positions greater than 8.

  • Sort the keywords by decreasing number of impressions (highest first).

The keywords that generate impressions but have not yet reached the 1st page of Google, or are barely at the bottom of the 1st page, are your gold nuggets.

Make a note of them as the priority to work on in order to improve your positions, increase your click-through rate and ultimately generate sales.

How to deal with indexing errors with the Google Search Console?

In the "Index" section of the menu, you will find the "Coverage" report, which shows which pages are indexed and what needs to be corrected for those that could not be indexed.

To do this, go to the "Index" section of the menu, then to "Coverage".

The coverage report will look like this:

With 4 sections: Error, Valid with warnings, Valid and Excluded.

The "Valid" report

This section in green should be checked as a first step.

This section lists the URLs that are indexed and valid, i.e. without any error or anomaly reported by Google.

First of all, check the total number of URLs displayed at the top under "Valid" (here 2.3k). Does this correspond to the number of pages on your site to be indexed? If not, keep this in mind as these missing URLs are most likely in one or more of the other 3 sections.

The graph also allows you to see the progression of the indexing: does it seem consistent? Or on the contrary, is it lower or higher than the reality of content creation on your site? Again, keep this in mind for the future.

Present in the sitemap or not?

Further down, in the "Details" subsection, we see 2 lines: "Sent and indexed" and "Indexed, but not sent via a sitemap".

If your sitemap is up to date and contains all your indexable content, all your valid URLs should be in the "Sent and indexed" section.

=> But sometimes there are inconsistencies to spot and correct.

For this, we will take a closer look at the list of URLs by clicking on the corresponding line.

List of sent and indexed URLs

If the list contains URLs that should not be indexed: consult the declared sitemap(s) to identify possible errors, then de-index these unwanted pages.

List of URLs indexed but not sent via a sitemap

You have URLs in this subsection because:

  1. the sitemap is not up to date, or it does not contain all your URLs but only the main ones: don't panic, you just have to update your sitemap(s) (see the minimal example below). This is the case in our example above.
  2. you do not have a sitemap because the size of your site does not justify its creation.
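
For reference, a sitemap is a simple XML file listing the URLs you want Google to crawl, usually placed at the root of the site (for example /sitemap.xml). A minimal sketch, with mon-site.com as a placeholder domain:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.mon-site.com/</loc>
      </url>
      <url>
        <loc>https://www.mon-site.com/blog/</loc>
      </url>
    </urlset>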

The “Valid with warnings” report

This section in orange should be checked as a second step.

This report shows you the pages for which Google issues a warning because they are indexed but blocked by the robots.txt file, as you can see below.

Explanation of the warning

The robots.txt file is a blocking tool, not a de-indexing tool: a page blocked by robots.txt can still end up indexed by Google if a third-party site links to it.
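
As a reminder, a blocking rule in robots.txt looks like this (the path is purely illustrative):

    User-agent: *
    Disallow: /private-folder/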

Interpretation and correction

Click on the "Warnings" section:

Then click on the corresponding line of the "Details" subsection to display the detailed list:

  • If the pages are to be indexed: remove them from robots.txt as soon as possible to allow their indexing.
  • If not, you must first remove these pages from the robots.txt, de-index them properly (for example with a "noindex" directive), then put them back in the robots.txt.

The error report

This section in red should be checked as a third step. This is the section listing the URLs that Google has not indexed because of errors. Unlike the "Excluded" section, these are URLs that you have chosen to send to Google via a sitemap, which is why Google warns you via this error section.

In this example, we can see 131 errors spread across 2 types of problems.

Click on the section in red "Error":

In the table below, you will be able to identify the cause of the error in order to correct it.

Then click on each error to get the list of the URLs concerned.

Technical Errors

Here are the different "technical" errors:

  • Server error (5xx): the server did not respond to an apparently valid request.
  • Error related to redirections: the 301/302 redirection does not work.
  • Sent URL appears to be a "soft 404" error: you sent this page to be indexed, but the server returned what appears to be a "soft 404" error.
  • Sent URL returns an unauthorized request (401): you sent this page to be indexed, but Google received a 401 response (unauthorized access).
  • Sent URL not found (404): You sent a URL to be indexed, but it does not exist.
  • Sent URL contains a crawl error: you sent this page to be indexed, and Google detected an unspecified crawl error that does not match any of the other reasons.

For all these errors: correct the error if the page should be indexed or remove it from the sitemap and internal links otherwise.

Indexing errors

  • URL sent blocked by the robots.txt file: this page is blocked by the robots.txt file and sent by the XML sitemap at the same time. Remove it from the robots.txt or from the XML sitemap, depending on whether you want it indexed or not.
  • URL sent designated as "noindex": you have sent this page to be indexed, but it contains a "noindex" directive in a meta tag or HTTP header (see the example below). If you want this page to be indexed, you have to remove this tag or HTTP header; otherwise, you have to remove it from the sitemap.
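
For reference, the "noindex" directive can be placed in the <head> of the page:

    <meta name="robots" content="noindex">

or sent by the server as an HTTP response header:

    X-Robots-Tag: noindex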

You can now update your pages so that there are no more errors and they can be indexed without problems!

The "Excluded" page report

This section in gray should be checked as the last step.
This is the section listing the URLs that Google has not indexed, judging this to be intentional on your part. Unlike the "Error" section, these are URLs that you have not chosen to send to Google via a sitemap, so it cannot assume that this is an error.

Click on the grey section "Excluded":

These pages are not indexed, but Google assumes this is intentional on your part.

It is therefore interesting to monitor why these pages are not indexed, because you don't want Google to de-index a page you would like to index!

Let's first list the URLs excluded for technical reasons.

Technical causes

Blocked due to an unauthorized request (401): An authorization request (401 response) prevents Googlebot from accessing this page. If you want Googlebot to be able to crawl this page, remove the access credentials or allow Googlebot to access your page.

Not Found (404): This page returned a 404 error when requested. Google detected this URL without an explicit request or sitemap. Google may have detected the URL via a link from another site or the page may have been removed. Googlebot will probably continue to try to access this URL for some time. There is no way to tell Googlebot to permanently forget about a URL. However, it will crawl it less and less often. 404 responses are not a problem if they are intentional, just avoid linking to them. If your page has been moved, use a 301 redirect to the new location.
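
If the page has moved, a permanent (301) redirect tells Google and your visitors where to find the new version. A minimal sketch for an Apache server via an .htaccess file (the paths and domain are placeholders; other servers such as Nginx have their own equivalent syntax):

    # .htaccess: permanently redirect the old URL to its new location
    Redirect 301 /old-page/ https://www.mon-site.com/new-page/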

Crawl Anomaly: an unspecified anomaly occurred when crawling this URL. It may be caused by a 4xx or 5xx response code. Try to analyze the page with the URL inspection tool to see if anything prevents it from being crawled, then loop back with your technical team.

Soft 404: the requested page returns what appears to be a "soft 404" response, meaning it tells users the page cannot be found without returning the corresponding 404 response code. We recommend that you either return a real 404 response code for "not found" pages (and remove them from your internal linking) to prevent them from being indexed, or add enough information to the page so that Google no longer considers it a "soft 404" error.

Causes related to duplicates or canonicals

Other page with correct canonical tag: this page is a duplicate of a page that Google recognizes as canonical. It correctly links to the canonical page. In theory, there is no action to take with Google, but we recommend that you check why these 2 pages exist and are visible to Google in order to make the appropriate corrections.

Duplicate page with no canonical tag selected by the user: This page has duplicates, none of which are marked as canonical. Google thinks this page is not canonical. You should designate the canonical version of this page explicitly. Inspection of this URL should indicate the canonical URL selected by Google.
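
Declaring the canonical version is done with a link tag in the <head> of each duplicate page, pointing to the URL you want indexed. A minimal sketch (the URL is a placeholder):

    <link rel="canonical" href="https://www.mon-site.com/red-shoes/">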

Duplicate page, Google did not choose the same canonical URL as the user: this page is marked as canonical, but Google thinks that another URL would be a more appropriate canonical version and has therefore indexed that other URL. We recommend that you check the origin of the duplicate (perhaps you should use a 301 redirect instead of keeping both pages), and then adjust the canonical tags so that they match what you actually want Google to index. This page was detected without an explicit crawl request. Inspection of this URL should indicate the canonical URL selected by Google.
If you have this message on 2 different pages, it means that they are too similar and that Google does not see the point of having two. Let's say you have a shoe sales site, if you have a "red shoes" page and a "black shoes" page that contain little or no content, or content that is too similar, with only the title changing: you have to ask yourself if these pages should really exist, and if so, improve their content.

Duplicate page, the sent URL has not been selected as the canonical URL: the URL is part of a set of duplicate URLs without an explicitly declared canonical page. You requested that this URL be indexed, but because it is a duplicate and Google thinks another URL would be a better canonical version, Google indexed that other URL instead of the one you declared. The difference between this state and "Google did not choose the same canonical page as the user" is that in this case you explicitly requested indexing. Inspection of this URL should show the canonical URL selected by Google.

Page with redirection: the URL is a redirect and therefore has not been added to the index. There is nothing to do in this case, except to check that the redirection is the intended one.

Page deleted due to a legal claim: the page has been removed from the index due to a legal claim.

Causes related to indexing directives

Blocked by a "noindex" tag: when Google tried to index the page, it identified a "noindex" directive and therefore did not index it. If you don't want the page to be indexed, you have done it correctly. If you want it to be indexed, you must remove this "noindex" directive.

Blocked by the page removal tool: the page is currently blocked by a URL removal request. If you are a validated site owner, you can use the URL Removal Tool to see who is making the request. Removal requests are only valid for 90 days from the date of removal. After that time, Googlebot may crawl your page again and index it, even if you don't send another indexing request. If you don't want the page to be indexed, use a "noindex" directive, add access credentials to the page, or delete it.

Blocked by robots.txt: a robots.txt file prevents Googlebot from accessing this page. You can check this with the robots.txt testing tool. Note that this does not mean that the page will not be indexed by other means. If Google can find other information about the page without loading it, the page might still be indexed (although this is rare). To ensure that a page is not indexed by Google, remove the robots.txt block and use a "noindex" directive.

Crawled, currently not indexed: the page has been crawled by Google, but not indexed. It may be indexed in the future; there is no need to request indexation for this URL again.
This happens quite often with pages paginated after the first page, because the engine does not see the point of indexing them in addition to the first.
It is also possible that it concerns a large number of very similar or low quality pages, for which Google does not see the point of indexing them. We must therefore ask ourselves if it is not better to de-index them voluntarily, unless we plan to work on them in the near future.

Detected, currently not indexed: the page has been found by Google, but has not yet been crawled. Generally, this means that Google tried to crawl the URL, but the site was overloaded. Therefore, Google had to postpone the crawl. This is why the last crawl date is not shown in the report.
This happens quite often with pages paginated after the first page, because the engine does not see the point of crawling them in addition to the first page.
It is also good to dig into the page’s depth: when you have many deep pages, it is difficult for the robot to crawl your site, so it decides to ignore part of the site. This problem must be corrected as soon as possible because it can affect the overall crawlability of the site including other key pages for your SEO.

You know all about the Search Console excluded URLs report!
A SmartKeyword consultant can help you to audit your indexing coverage, do not hesitate to contact us!

The URL inspection tool

The inspection tool gives you information about a specific page: AMP (Accelerated Mobile Pages) errors, structured data errors and indexing problems.

To access this report, insert the URL you want to inspect in the search bar.

With this report you can also:

  • Inspect an indexed URL: to gather information about the version of your page indexed by Google.
  • Inspect a live URL: to check whether a page on your site can be indexed, by clicking the "Test live URL" button in the top right corner.
  • Request indexing: to ask Google to crawl an inspected URL, by clicking on "Request indexing" in the first section of the report.

Manage links using the Google Search Console

Manage your netlinking: Which sites link to your site?

To do this, go to the "Links" tab in the menu.

The information on netlinking is in the left column named "external links".

You will find 3 reports:

  • Main landing pages: lists the pages of your site that have external links pointing to them.
  • Main origin sites: lists the external sites that link to your site.
  • Main anchor texts: the anchor texts used in these external links.

Will I see all my links?

Not all your backlinks will be listed. This is normal, don't worry! Here are the reasons why:

  • Problem with robots.txt: the data displayed reflects the content detected and explored by Googlebot during the crawling process. If a page on your site is blocked by a robots.txt file, the links pointing to that page are not listed. The total number of these pages is available in the crawl section, in the "Blocked URLs (robots.txt files)" tab of your Search Console.
  • Problem with your 404 pages: if a non-functional or incorrect link is detected on your site, it is not listed in this section. We recommend that you regularly check the crawl errors page for 404 errors detected by Googlebot when crawling your site; this ensures that if an external site links to you, you actually benefit from the popularity boost!
  • Google hasn't crawled it yet! Google may simply not have analyzed the referring site's page yet, in which case it has not seen that it links to you. Be patient, it will happen soon!
  • Your site may be indexed under https or under another version (with or without www). For example, if you don't see the expected link data for http://www.example.com, make sure you have also added an http://example.com property to your account, then check the data for that property.

Pro tip: to avoid this, set a preferred domain.

Manage your internal links

The number of internal links pointing to a page helps search engines determine the importance of that page. If an important page is missing from the list, or if a less important page has a large number of internal links, it may be useful to review the internal structure of your site.

That's all well and good, but how do you use it?

  • Highlighting a certain page

If you want to make sure that a page of your site is well linked (that it receives several internal links from other pages), you can check it from this menu. Handy!

  • Delete or rename pages

If you want to delete or rename pages on your site, check this information beforehand to identify any non-functional links and avoid this type of problem.

  • If no information is displayed in this report...

This may mean that your site is new and has not yet been crawled. If not, check the crawl error report to see if Google has encountered any problems crawling your site.

How to manage a website’s mobile usability?

Worldwide web traffic from mobile devices is growing rapidly. And recent studies show that mobile users are more likely to return to mobile-friendly sites. Google can help you with this!

To do this, go to the "Improvements" section of the menu and then "Mobile Ergonomics".

In the graph, find the mobile usability problems detected over time on your site.

Here are the different problems you may have:

  • Uses incompatible plug-ins
  • Viewport not configured
  • Viewport not set to "device-width", i.e. the page cannot adapt to different screen sizes (see the example below)
  • Text unreadable because it is too small
  • Clickable elements too close together
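
The viewport is declared with a meta tag in the <head> of each page; the recommended declaration looks like this:

    <meta name="viewport" content="width=device-width, initial-scale=1">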

Go further with the Google Search Console

As the old Google Search Console is still available, you can go even further in your website optimizations with the following reports:

HTML Improvements

This report shows you the possible difficulties encountered by Google when crawling and indexing your site. Consult it regularly to identify changes that could improve your ranking in Google's search results pages, while ensuring a better user experience for your visitors. For example, below you can see that there are duplicate meta descriptions.

Manual actions

This report, found in the new Google Search Console, lists the manual actions applied to your site and provides information to help you resolve them.

International Targeting

This report allows you to manage one or more websites designed for multiple countries in multiple languages; you should ensure that the version of your pages that appears in search results is appropriate for both country and language.

Structured data

This report, which you can find under "Appearance in search results > Structured data", gives you the list of URLs on your site that contain structured data. You can click on each URL to see whether Google has taken your markup into account (fantastic, no?).
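
Structured data is most often added to a page as a JSON-LD block in the HTML. A minimal sketch (the values are illustrative):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "The complete guide to using the Google Search Console",
      "author": { "@type": "Person", "name": "Louis Chevant" }
    }
    </script>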

Data Highlighter

"Appearance in search results > Data markers": this is a tool that allows Google to interpret the format of structured data on your website.

URL parameters

"Crawl > URL Settings": this report allows you to tell Google what your site's settings are for and how to interpret them.

Conclusion

All these features explain why webmasters agree that Google Search Console is a very valuable complement to Google Analytics, with the double objective of better understanding where your traffic comes from and strategically increasing that traffic.

What's next?

You have just discovered with SmartKeyword how to use Search Console; now discover how to add a user on Search Console.

class="img-responsive
   Article written by Louis Chevant
