To understand what a crawler is, it helps to remember how Google and other search engines work. They send small robots (in other words, small computer programs) to every site that makes up the web. These bots enter and browse sites via the links they find along the way; a link is, in a sense, a gateway for the robots. A crawler, or crawling software, works on the same principle, except that it is controlled not by Google but by us, site editors and SEOs: robots simulating the behavior of the search engines' own bots, which we program to visit our site or those of our competitors. You can crawl an entire site or only specific pages. The objective of a crawler is to detect a site's structural anomalies, but also to evaluate its performance, its incoming links, and so on. Combined with server logs, this data is a gold mine for any site owner. And although crawlers usually come up in the context of SEO, there are other types, including crawlers used by marketing teams.
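To make this concrete, here is a minimal sketch of such a robot in Python: it fetches a page, extracts its links, and follows them breadth-first within the same domain. It is a toy for illustration only (the start URL is a placeholder, and the `requests` and `beautifulsoup4` packages are assumed), not a model of how Googlebot actually works:

```python
# Toy crawler: follow links breadth-first within one domain.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=50):
    domain = urlparse(start_url).netloc
    seen = {start_url}            # URLs already discovered
    queue = deque([start_url])    # URLs waiting to be visited
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip unreachable pages
        for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, anchor["href"]).split("#")[0]
            # A link is the bots' gateway: follow it if it stays on-site.
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

print(len(crawl("https://www.example.com")), "URLs discovered")
```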


What are the different types of crawler robots?

Indexing for web search engines

As mentioned above, search engines run their crawlers constantly to evaluate websites and build their search rankings. Each pass of these crawlers leaves a trace in your server logs.
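For example, a standard combined-format access log already reveals every Googlebot pass. Here is a hedged sketch (the log path is a placeholder, and matching on the user-agent string alone is an assumption, since it can be spoofed; real verification uses a reverse DNS lookup):

```python
# Count Googlebot hits per URL in a combined-format access log.
# The log path is a placeholder; the user-agent match is naive.
from collections import Counter

hits = Counter()
with open("/var/log/nginx/access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        if len(parts) > 1:
            request = parts[1].split()   # e.g. ['GET', '/page', 'HTTP/1.1']
            if len(request) >= 2:
                hits[request[1]] += 1

for url, count in hits.most_common(10):
    print(count, url)
```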

Diagnostic crawlers

These crawlers come with SEO tools. They can be run internally (on your own site) or on third-party sites. There are also crawlers that scan the web to help you take stock of your competition, your link profile, and so on.

Price monitoring

Price monitoring is one of the most widely used business strategies. Keeping up to date with your competitors' pricing practices is essential for setting your own pricing policy. To do this, there are tools that use crawlers to retrieve product price data, as in the sketch below. Some large marketplaces have even built crawlers of this type into their platforms, allowing e-merchants to adjust their prices to match those of competitors on the marketplace.
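As an illustration only, the core of such a tool can look like the following. The product URLs and the ".price" CSS selector are hypothetical (every shop needs its own selector), and any real scraper should respect robots.txt and the sites' terms of service:

```python
# Illustrative price check: fetch product pages and read the price tag.
# URLs and the ".price" selector are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

PRODUCT_PAGES = [
    "https://competitor-a.example/product/123",
    "https://competitor-b.example/product/456",
]

for url in PRODUCT_PAGES:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        print(url, "unreachable")
        continue
    tag = BeautifulSoup(html, "html.parser").select_one(".price")
    print(url, tag.get_text(strip=True) if tag else "price not found")
```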


Black hat SEO practices

Some black hat SEOs use crawlers to automate tasks designed to manipulate Google's algorithms. One of the best known (though now in decline) is the automated posting of links in blog comments. As SEOs, we do not endorse this kind of practice.

How to please crawler robots such as Googlebot or Bingbot?

Pleasing the crawlers of Google and other search engines requires taking care of your site on many levels. Performance, content quality, accessibility, netlinking: each SEO component is an evaluation criterion for the robots. You must therefore offer unique content, properly integrated into a coherent internal-linking structure that keeps the bots from getting lost within the site.

The role of the crawl in SEO strategy

Thinking through your site's internal linking is essential to control the bots' path and optimize your crawl budget. But without an appropriate crawler, it is difficult to visualize how your pages link together and to make the necessary optimizations. A site's structure is often far more complex than it seems, especially on e-commerce sites, where faceted navigation complicates the task considerably. In addition, an external crawler is essential to know precisely which sites link to yours (backlinks). This data has a double benefit: improving your link profile and guarding against black hat actions.
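One of the simplest signals such a crawler can compute is click depth: how many clicks separate each page from the homepage, since deeply buried pages tend to be crawled less often. A minimal sketch, with the same placeholder URL and `requests`/`beautifulsoup4` assumptions as the earlier example:

```python
# Measure each internal page's click depth from the homepage.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def page_depths(home, max_pages=200):
    domain = urlparse(home).netloc
    depths = {home: 0}
    queue = deque([home])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, anchor["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1   # one click deeper
                queue.append(link)
    return depths

# Report the most deeply buried pages first.
for url, depth in sorted(page_depths("https://www.example.com").items(),
                         key=lambda item: -item[1])[:10]:
    print(depth, url)
```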


Should you use a free or paid crawler?

There is no universal answer to this question. It depends on the type of site, the site owner's level of SEO knowledge, and the budget available.

Free crawlers

  • Xenu is a crawler whose original purpose was to detect broken links on a site (a task sketched in the example after this list). It now also reports other useful data such as title length, page depth, and image weight. It is a good tool for getting familiar with the basics of SEO.
  • LinkExaminer is, as the name suggests, a link checker: it scans each page and analyzes its HTML code to extract the links it contains. It can also perform tasks such as extracting page titles and identifying duplicate pages.
  • Microsoft's free SEO Toolkit is a Windows tool that must be installed separately. It scans websites for content relevant to search engines and can check links, 'noindex' and 'nofollow' directives, page titles, meta tags, images, etc. It is interesting for beginners, but it quickly shows its limits when you want to make major technical optimizations.
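To show how simple the core of a Xenu-style check is, here is a minimal broken-link sketch. The URL list is illustrative (in practice it would come from a crawl like the one sketched earlier), and `requests` is assumed to be installed:

```python
# Minimal broken-link check: request each URL and report errors.
# The URLs are placeholders; a real run would feed in crawled links.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/old-page",
]

for url in urls:
    try:
        # HEAD is cheap; some servers reject it, so fall back to GET.
        response = requests.head(url, timeout=10, allow_redirects=True)
        if response.status_code == 405:
            response = requests.get(url, timeout=10)
        status = response.status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("BROKEN:", url, status)
```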

Paid crawlers

  • Screaming Frog: free up to 500 URLs, Screaming Frog is a solid tool that scans a website's URLs and retrieves the key elements needed to analyze and audit it. However, the data is quite raw and must be exported to a spreadsheet to be exploited (see the sketch after this list).
  • Botify and Oncrawl: these two tools combine a crawler with a log analyzer. The data is presented as graphs, pie charts, and curves, which makes it easier to interpret. Botify is undoubtedly the most advanced tool, but it is quite expensive and requires advanced skills. Oncrawl is more accessible, both technically and financially.
  • DeepCrawl is a cloud-based crawler. It lets you crawl your own site as well as its backlinks, and it gives an overview of the site's structure and sitemaps. DeepCrawl has developed its own metric, DeepRank, which measures the weight of internal links as Google would.
  • SEMrush is a very complete and well-known tool for keyword research, competitor tracking, and more. It can detect ranking opportunities on the search engine as well as backlink opportunities, and it can even perform an SEO audit of a site (though never as thorough as one from a crawler specialized in structural analysis).
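Once a crawl has been exported, even a few lines of scripting go a long way. The sketch below assumes a Screaming Frog CSV export with "Address", "Status Code", and "Title 1" columns (check these names against your own file) and flags pages with missing or overlong titles:

```python
# Post-process a crawler CSV export with pandas to flag title issues.
# The file name and column names are assumptions based on a typical
# Screaming Frog export; adjust them to match your own file.
import pandas as pd

crawl = pd.read_csv("internal_all.csv")

ok = crawl[crawl["Status Code"] == 200]   # keep only live pages
titles = ok["Title 1"].fillna("")

missing = ok[titles == ""]                # pages with no <title>
too_long = ok[titles.str.len() > 60]      # titles likely truncated in results

print(f"{len(missing)} pages without a title")
print(f"{len(too_long)} pages with titles over 60 characters")
print(too_long[["Address", "Title 1"]].head())
```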

 

Conclusion

Crawlers are thus ubiquitous on the web and have become indispensable allies for any site owner wishing to optimize their site and keep an eye on the competition. While some free tools are a good way to get started, it is generally worth investing in a more powerful crawler offering far more advanced features. It is an investment that quickly pays for itself, because it keeps you one step ahead of your competitors.

class="img-responsive
   Article written by Louis Chevant

Further reading

The SEO technical audit guide

A step-by-step method for carrying out a technical audit of your site.