Technical SEO: the ultimate guide

When you start working on your SEO strategy, there is obviously the on-page content to take into account, that is, the optimization of the content on each page of the website. But you also have to consider the whole technical side of SEO. Ignoring it is like restoring a car's bodywork while forgetting to replace the worn parts under the hood.

Find out how to optimize your website with the technical SEO techniques below.


The 24 chapters of SEO techniques

A technical SEO audit is one of the keys to a good SEO strategy. It helps you check how the website functions in order to correct any technical errors you find.

  • What tools to use for a technical SEO audit?
  • Basic technical check-ups
  • Web pages’ SEO audit
  • Indexing check-up
Read more

A breadcrumb is a small trail of text, usually located at the top of a page, that lets users and robots know where they are in the site's structure.

  • The name's history and origins
  • What are the different types of breadcrumb trail?
  • What are the advantages of the breadcrumb trail for SEO?
  • How to implement the breadcrumb trail?
Read more
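As a concrete illustration, here is a minimal breadcrumb sketch: a visible HTML trail plus the matching schema.org BreadcrumbList markup that search engines can read. The URLs and page names are made-up examples.

```html
<!-- Visible breadcrumb trail for users -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> › <a href="/shoes/">Shoes</a> › Red sneaker
</nav>

<!-- Machine-readable equivalent (schema.org BreadcrumbList) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/"},
    {"@type": "ListItem", "position": 2, "name": "Shoes", "item": "https://example.com/shoes/"},
    {"@type": "ListItem", "position": 3, "name": "Red sneaker"}
  ]
}
</script>
```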

The canonical URL is a tag used to tell a search engine which "main" URL to index when the same content is available under several URLs. It prevents the engine from choosing for you.

  • Canonical URL: definition
  • How to optimize canonical URLs?
  • How to set up a canonical URL and what added value does it bring?
  • Canonical URL: mistakes to avoid
  • Solutions/tips for using the canonical URL properly
Read more
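A minimal sketch of the tag itself, placed in the <head> of every duplicate variant (the URL is a made-up example):

```html
<!-- Each duplicate variant points to the one preferred URL -->
<link rel="canonical" href="https://example.com/product/red-sneaker/">
```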

The meta robots tag tells search engines which pages they may index and whose links they may follow, and which pages should be kept out of the index.

  • What is the meta robots tag?
  • How to set up the meta robots tag in the header?
  • The meta robots tag’s different directives and their impact
  • Meta robots tag: most common mistakes
  • Ensure the meta robots tags’ compliance on your website
Read more
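A minimal sketch of a common configuration, placed in the page's <head>: keep the page out of the index while still letting the bot follow its links.

```html
<!-- Do not index this page, but do crawl the links it contains -->
<meta name="robots" content="noindex, follow">
```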

You need the hreflang tag if your website, or some of its content, exists in several languages, or if some pages have regional variations.

  • What is the hreflang tag?
  • Hreflang and SEO: why is this tag so important?
  • Where should the hreflang tag be placed?
  • Hreflang integration mistakes to avoid
  • Best practices for successful implementation of your hreflang tag
  • How to check the correct implementation of your hreflang tags?
Read more
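A minimal hreflang sketch for the <head> of each language version (the URLs are made-up examples). Every variant should list all the others plus itself, and x-default covers users who match none of the listed languages.

```html
<link rel="alternate" hreflang="en" href="https://example.com/en/">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```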

The 404 page is often displayed with a message such as "404 error", "404 file not found", "file not found", or "the page no longer exists".

  • What is a 404 error?
  • What are the causes of the 404 error?
  • What are the 404 errors’ impacts on organic ranking?
  • How to detect and fix 404 errors?
  • Custom error pages
Read more
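On Apache, a custom error page can be wired up with a single directive; this is a minimal sketch assuming a `/404.html` page you have created.

```apache
# .htaccess (Apache): serve a custom, branded page on 404 errors
ErrorDocument 404 /404.html
```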

A 301 redirect can be compared to a change of mailing address.

  • What is a 301 redirect?
  • Why do a permanent redirect?
  • When to do a 301 redirect?
  • How to do a 301 redirect?
  • Common errors on 301 and 302 redirects
  • 302 or 301 redirect?
Read more
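A minimal sketch on Apache (mod_alias), with made-up paths: a single directive sends visitors and bots permanently from the old URL to the new one.

```apache
# .htaccess (Apache): permanent (301) redirect from an old URL to its new address
Redirect 301 /old-page/ https://example.com/new-page/
```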

According to recent estimates, only 1% of all websites are secured with an SSL certificate, yet around 40% of the organic results on Google's first page are HTTPS websites.

  • What’s the difference between HTTP and HTTPS?
  • Why should you switch the website to HTTPS?
  • How to migrate to a secure HTTPS site?
  • How do you fix unsecured pages?
Read more
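Once the certificate is installed, every HTTP URL should 301-redirect to its HTTPS twin. A minimal sketch for Apache with mod_rewrite:

```apache
# .htaccess (Apache + mod_rewrite): force HTTPS with a single 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```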

The Robot Exclusion Protocol, better known as robots.txt, is a convention designed to prevent web crawlers from accessing all or part of a website.

  • What is robots.txt?
  • How does it work?
  • Why do you need robots.txt?
  • Robots.txt syntax
  • Where to put the robots.txt?
  • How to create a robots.txt for a website?
  • Robots.txt & Search Console: validate and test
Read more
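Robots.txt rules can be tested offline before deployment. A minimal sketch using Python's standard-library `urllib.robotparser`; the file content and URLs are made-up examples. Note that this parser applies rules in order of appearance, so the more specific Allow line comes first here.

```python
# Parse robots.txt rules offline and check which URLs a bot may fetch.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked: falls under Disallow: /admin/
print(rp.can_fetch("*", "https://example.com/admin/secret.html"))
# Allowed: no rule matches, so crawling is permitted by default
print(rp.can_fetch("*", "https://example.com/blog/post.html"))
```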

How to create a good sitemap? We will give you all the tips in this article.

  • What is a sitemap?
  • What can a sitemap contain?
  • How to create a sitemap?
  • Configure the sitemap in Google Search Console
  • Our advice to optimize its use
  • Common mistakes when generating the sitemap
Read more
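A sitemap can be generated programmatically. A minimal sketch with Python's standard library, following the sitemaps.org XML format; the URLs and dates are made-up examples.

```python
# Build a sitemaps.org-compliant XML sitemap from a list of (URL, lastmod) pairs.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("https://example.com/", "2024-01-15"),
    ("https://example.com/blog/", "2024-01-10"),
])
print(sitemap_xml)
```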

To understand what a crawler is, it helps to remember how Google and other search engines work: they send small robots (in other words, small computer programs) out to all the websites that make up the web. These bots enter websites and browse them through the links they find along the way.

  • What are the different types of crawler robots?
  • How to please crawler robots such as Googlebot or Bingbot?
  • The role of the crawl in SEO strategy
  • Should you use a free or paid crawler?
Read more

Putting page load time aside is a big mistake: 40% of users leave a page that has not loaded after 3 seconds. And with the growth of mobile traffic, load time matters more than ever.

  • How does page loading work?
  • Why improve page loading speed?
  • What tool to test a website’s speed?
  • How to speed up page loading?
Read more
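Two of the classic server-side levers are browser caching and text compression. A minimal sketch for Apache, assuming mod_expires and mod_deflate are available; the exact lifetimes are illustrative choices, not recommendations.

```apache
# .htaccess (Apache): cache static assets and compress text responses
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/webp "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
</IfModule>
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```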

Optimizing pages is not enough if the links between them are not optimized. That’s why building your internal linking in the form of semantic cocoons is essential. In this guide, we'll tell you why and how to put it all together!

  • What is meant by "links"?
  • Why are they important in SEO?
  • Internal links: identify technical problems with internal linking
  • What are semantic cocoons?
Read more

A website's depth is the number of clicks between a given page and the homepage. For example, if a product page is reachable only after 4 clicks from the homepage, its depth is 4.

  • Page depth affects SEO performance
  • How to improve a website’s structure?
  • Website depth: how to spot issues and fix them?
Read more
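The click-depth definition above maps directly onto a breadth-first search over the internal-link graph. A minimal sketch in Python, using a toy site with made-up pages:

```python
# Compute each page's click depth from the homepage via breadth-first search.
from collections import deque

def click_depths(links, home="/"):
    """links: {page: [linked pages]}. Returns {page: clicks from homepage}."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/category/", "/about/"],
    "/category/": ["/category/shoes/"],
    "/category/shoes/": ["/product/red-sneaker/"],
}
print(click_depths(site))
# The product page sits at depth 3; deep pages tend to be crawled less often.
```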

The sitelinks present on Google first appeared around 2005. They went through many iterations before becoming sitelinks as we know them today.

  • Sitelink definition and visual presentation
  • Why are sitelinks important?
  • How to get Google sitelinks for a website?
  • The main questions around sitelinks
Read more

You’ve surely heard that pagination can be detrimental to a website's SEO: this is both true and false, because the problem is not with pagination itself, but with the way it is implemented.

  • What is pagination on a website?
  • Why set up a pagination?
  • How to set up a pagination?
  • Good pagination practices for SEO
  • Common pagination errors
Read more
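A minimal sketch of classic pagination markup in the <head> of page 2 of a category (made-up URLs). Note that Google announced in 2019 that it no longer uses rel="prev"/rel="next" as an indexing signal, but the annotations remain harmless and other crawlers may still read them; the self-referencing canonical is the important part.

```html
<!-- Page 2 of a paginated category: each page keeps its own canonical -->
<link rel="canonical" href="https://example.com/category/shoes/?page=2">
<link rel="prev" href="https://example.com/category/shoes/?page=1">
<link rel="next" href="https://example.com/category/shoes/?page=3">
```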

We give you all the tips to find and fix duplicate content, which can unfortunately hurt your SEO.

  • Duplicate content definition
  • How does duplicate content occur within a website?
  • How to check for duplicate content?
  • How to fix or remove duplicate content?
Read more
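Exact duplicates within a site can be spotted by fingerprinting each page's text. A minimal, illustrative sketch in Python (whitespace-insensitive hashing over made-up pages); real audits usually also handle near-duplicates, which this does not.

```python
# Group URLs whose normalized text content is byte-for-byte identical.
import hashlib

def find_duplicates(pages):
    """pages: {url: text}. Returns groups of URLs sharing identical text."""
    groups = {}
    for url, text in pages.items():
        normalized = " ".join(text.split())  # collapse whitespace differences
        fingerprint = hashlib.sha256(normalized.encode()).hexdigest()
        groups.setdefault(fingerprint, []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "/product?id=1": "Red sneaker, size 42.",
    "/product/red-sneaker/": "Red sneaker,  size 42.",
    "/about/": "About our store.",
}
print(find_duplicates(pages))
```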

Log analysis is a technique regularly used by SEO professionals. It gives a global view of a website's performance, its internal linking, and how both affect robots' behavior.

  • What is a server log file?
  • What are logs?
  • What information can be extracted from them?
  • What is the crawl budget, and how to optimize it?
  • Identify page optimizations
Read more
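A taste of what log analysis looks like in practice: a minimal Python sketch that counts Googlebot hits per URL in "combined"-format access-log lines. The log lines below are fabricated examples, and real tooling would also verify the bot's IP, which this does not.

```python
# Count which URLs Googlebot requested, from combined-format access-log lines.
import re
from collections import Counter

LINE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" \d+ \d+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(log_lines):
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1
    return hits

log = [
    '66.249.66.1 - - [10/Jan/2024:10:00:00 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jan/2024:10:00:05 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Jan/2024:10:01:00 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_hits(log))
```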

The first domain extensions, also known as TLDs (Top-Level Domains), were primarily intended to classify domain names according to their function. This purpose still holds, although it is less and less respected by website owners, whether out of ignorance or on purpose.

  • The different types of domain name extensions
  • Other domain name extensions
  • Choosing the right extension in 3 essential points
  • Extensions to avoid
  • Top Level Domain
Read more
Are you a plumber? Do you want to rank plumber.com?
We get it.
But is it really a good idea? 4 or 5 years ago, there was no doubt, but what about now?
  • Domain name or brand: which one to choose?
  • OK, so you keep your domain name for your website, and just buy out the other domain name?
Read more

Cloaking is an optimization technique used by some SEOs and website owners, which involves displaying different content to bots and users. In other words, cloaking is a technique aimed at manipulating Google and other search engines’ crawlers.

  • The different cloaking techniques
  • White Hat cloaking: smart and tolerated techniques
  • The risks involved and Google's sanctions
Read more

It is essential to offer users a smooth, ergonomic and satisfying experience. The tree structure of your website also plays a key role in organic search rankings.

  • What is a website tree structure?
  • The role of a website tree in SEO
  • The 7 steps to create a website tree structure
Read more

Internet users rely on search engines for information, education, online shopping and transactions. Yet even though these tools put so much data at users' fingertips, many people do not know how they work.

  • What is the difference between a browser and a search engine?
  • What are the main operating steps of a search engine?
  • How do search engines determine the relevance of results?
  • What is the most used search engine?
  • How many searches are carried out each day on search engines?
Read more

Negative SEO is one of the most formidable weapons some people use to harm a competitor's search rankings. Note that Google does not hesitate to penalize those who do not respect its guidelines.

  • What is Negative SEO?
  • The main Negative SEO techniques
  • How to prevent and protect yourself from negative SEO attacks?
Read more

Further reading

The complete guide to internal linking

The step-by-step method to build your semantic cocoons, your internal links and the optimal tree structure of your website.

Download for free

The new way to perform in SEO

Contact us