Improving a site's organic search ranking depends on a multitude of criteria. We often hear about loading time, internal linking and netlinking, but too little about one essential criterion: site depth.

In simple terms, the depth of a site corresponds to the number of clicks between a given page and the homepage. For example, if a product page is reachable in 4 clicks from the homepage, that page's depth is 4. With the widespread use of JavaScript, it is not always easy to determine the exact number of clicks. You can use a browser extension such as Web Developer to disable JS and see what your site looks like without the "JS shortcuts". For larger sites, it is recommended to use a crawler (Botify, Oncrawl...). In any case, depth plays a major role in a site's ranking. We explain why, and above all how to reduce it as much as possible.
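In graph terms, click depth is simply the shortest path from the homepage. Here is a minimal sketch with a hypothetical link graph (crawlers like Botify or Oncrawl compute this for you at scale):

```python
from collections import deque

def click_depths(links, homepage):
    """Breadth-first search from the homepage: each page's depth
    is the minimum number of clicks needed to reach it."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical mini-site: home -> category -> subcategory -> product
site = {
    "/": ["/category"],
    "/category": ["/category/sub"],
    "/category/sub": ["/product"],
}
print(click_depths(site, "/"))
# {'/': 0, '/category': 1, '/category/sub': 2, '/product': 3}
```

Pages never reached by the traversal are orphan pages, another problem worth checking while you are at it.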


Page depth affects SEO performance

To understand the importance of depth in SEO, we need to understand how search engine robots work, especially Google's. Let's compare your site to a tower of champagne glasses (yes, like at lavish receptions!). The first glass is your homepage; the glasses at the bottom are your deepest pages. Your "SEO juice" is poured into the first glass. Inevitably, the glasses close to the top one (i.e. the homepage) are the ones that receive the most juice. If your tower is too high (= your site is too deep), the bottom glasses (= the deep pages) will hardly receive any SEO juice, and Google will not value them. If you have set up strong internal linking, as we will see below, you can hope to compensate for this. But it is always better to reduce the depth of your site while strengthening your internal linking.


What is the best depth distribution for a website?

Although we often hear that you should not exceed 3 or 4 levels of depth, the reality is more nuanced: the acceptable depth depends on the type of site and its theme. Small niche sites can see pages rank in the SERPs beyond level 3. E-commerce sites with a large number of pages are the most prone to depth problems, particularly because of faceted navigation (more commonly called "filters"). The latter, often combined with pagination problems, can generate extreme depths without anyone even noticing.


How to improve a website’s structure?

Use internal linking

Sometimes it is difficult to stay within 3 levels of depth, especially when you have many product sub-categories. As mentioned above, internal linking can then partially correct this problem. Let's go back to our champagne glasses. Imagine that the glasses at the top are full and those at the bottom are completely empty. You can shift the balance by sending juice from the full glasses to the empty ones: the deep pages are then no longer solely dependent on the juice sent from the homepage. In other words, you distribute the juice more evenly throughout your site. You can also supplement this juice with backlinks, which feed your deeper pages directly. It's a bit like adding a new bottle of champagne, but pouring it in at a lower level: you "skip" the homepage and feed the deeper pages.
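The effect on depth is easy to see if you model the site as a link graph. In this illustrative sketch (hypothetical URLs), a single internal link from a depth-1 category page cuts a product page's depth from 4 to 2:

```python
from collections import deque

def depths(links, start="/"):
    """Minimum clicks from the homepage to each page (breadth-first search)."""
    seen, queue = {start: 0}, deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen[nxt] = seen[page] + 1
                queue.append(nxt)
    return seen

# Hypothetical site: a product page sits 4 clicks from the homepage
site = {
    "/": ["/cat"],
    "/cat": ["/cat/sub"],
    "/cat/sub": ["/cat/sub/page2"],
    "/cat/sub/page2": ["/product"],
}
print(depths(site)["/product"])  # 4

# One internal link from the depth-1 category page shortens the path
site["/cat"].append("/product")
print(depths(site)["/product"])  # 2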


Use a breadcrumb trail

The breadcrumb trail is a landmark for users but also for robots. Its links, placed at the top of a page, make it easy to navigate back up the tree structure. So if robots land on a very deep page, generated for example by filters, the breadcrumb trail lets them climb back up to the sections and categories more easily. Note, however, that the breadcrumb trail helps with ascending navigation, not descending.

Take care of your pagination

Pagination is a plague in SEO and is often the primary source of excessive depth. There is no universal solution to this problem: it depends on the site and its complexity. Some successful sites opt for infinite scroll; others prefer the rel=prev / rel=next technique combined with "follow, noindex". Some publishers have chosen to completely overhaul their site structure to fix the root of the problem and drastically, naturally limit pagination. We therefore invite you to consult our guide to pagination to understand its impact on a site's depth.

Website depth: how to spot issues and fix them?

What is the maximum click depth of the site?

First of all, check the maximum click depth:

        • If it is 4 or less, all is well: you do not have a depth problem.
        • If it is greater than 4, move on to the next point.

What is the percentage of URLs concerned by a depth greater than 4?

Do deep pages account for 20, 40 or 70% of your site's pages? The more pages affected, the greater the impact, and the higher the priority of fixing it.
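If your crawler can export a list of URLs with their click depth, computing this percentage takes a few lines. A small sketch with made-up crawl data:

```python
def deep_page_share(depths, threshold=4):
    """Return the share (%) of URLs deeper than the threshold, and their list."""
    deep = [url for url, d in depths.items() if d > threshold]
    return 100 * len(deep) / len(depths), deep

# Hypothetical crawl export: URL -> click depth
crawl = {"/": 0, "/cat": 1, "/cat/p2": 2, "/cat/p9": 9, "/cat/p12": 12}
share, urls = deep_page_share(crawl)
print(f"{share:.0f}% of URLs deeper than 4")  # 40% of URLs deeper than 4
```

The returned list of deep URLs is exactly the "list of affected pages" discussed next.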

Then go through the list of affected pages and check which of the following cases applies to you.

1. Does the pagination need to be optimized?

If your site contains many products per category, pagination can generate very significant depth. Imagine a category with 30 paginated listing pages: with pagination of the type "1, 2, 3, 4, ..., 29, 30" or "previous, next", the last pages are only reachable click after click, and your site will have a depth of at least 31.
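The arithmetic behind that figure, as a small sketch (assuming the category page itself sits at depth 1 and only "previous/next" links connect the listing pages):

```python
def next_link_depth(category_depth, n_listing_pages):
    """With only 'previous/next' links, page N of a listing sits
    N-1 clicks below page 1, and the products linked from the last
    page are one click deeper still."""
    last_listing_depth = category_depth + (n_listing_pages - 1)
    return last_listing_depth + 1  # products on the last listing page

# Category at depth 1, 30 paginated listing pages
print(next_link_depth(1, 30))  # 31
```

Exposing direct links to intermediate pages (e.g. "1 ... 10 ... 20 ... 30") lowers this bound, which is why the pagination pattern matters so much.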

Locate these pages in the list, then change the pagination setup.

The affected pages can be both paginated listing pages and product pages: since the latter are linked from the listings, they are consequently one level deeper.

The same reasoning applies to blogs. Say you have a WordPress blog with incremental "previous/next" pagination and you publish around 5 articles per week. In this case, the depth will keep increasing, and the article that was very successful in June 2017 will be forgotten by 2020: its ranking will drop because the links it receives are too deep and too diluted.


2. Does the site structure need to be optimized?

If it is not a pagination problem, then it is the structure of the site itself that needs to be reviewed: great depth indicates too many levels. In this case, ask yourself whether the intermediate categories are useful for SEO, and consider replacing them with dynamic filters, for example, to reduce the number of levels.

Excessive depth is also an ergonomic hindrance for the user.


Site depth is therefore an essential criterion to take into account for SEO. Keep in mind that all your efforts in content writing or netlinking will come to nothing if your pages do not receive enough SEO juice! If you are struggling to reduce your site's depth, it may be because your structure is not optimal. Sometimes it is better to ask the right questions (even if they challenge your assumptions) and fix the root of the problem, rather than putting patches everywhere.

   Article written by Louis Chevant

Further reading

The complete guide to internal linking

The step-by-step method to build your semantic cocoons, your internal linking and the optimal tree structure of your website.