Cloaking is an optimization technique used by some SEOs and website owners, which involves showing different content to bots and to users. The goal is to climb faster and higher in the top organic search results (the blue links). In other words, cloaking is a technique aimed at manipulating the crawlers of Google and other search engines. It is therefore a cover-up tactic prohibited by all search engines and considered a Black Hat method. Yet many specialists still continue to use it. What should you make of it?



The different cloaking techniques

There are several techniques for cloaking, some more difficult to spot than others. 

Cloaking based on the user agent

Cloaking based on the user agent is widespread, but it is also one of the riskiest approaches. It simply consists of identifying the visitor's user agent and tailoring the content accordingly. For example, if the system identifies the visitor as a human rather than a robot, the page displayed will be designed for humans (little text, nice images, etc.). If the user agent is a robot, the page displayed will be ultra-optimized, with a lot of text. Why go to such trouble? Simply because SEO and UX don't always mix well. Ranking a page requires quality content, and it takes real web design expertise to present that content without producing a stodgy page. In other words, designing a single page that meets the requirements of both robots AND users takes time. It is faster to split the page into two versions: a polished visual one for users and a very raw one with optimized text for robots. However, this technique is tending to disappear, as it is easily detectable by robots, in particular through CSS.
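To make the mechanism concrete, here is a minimal Python sketch of user-agent cloaking, shown purely for illustration: the bot signature list and the two page variants (seo_variant.html, visual_variant.html) are hypothetical, and this practice violates search engine guidelines.

```python
# Minimal sketch of user-agent cloaking (illustration only; this practice
# violates search engine guidelines). Page names and bot list are hypothetical.

BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot", "yandexbot")

def is_bot(user_agent: str) -> bool:
    """Naively flag crawlers by looking for known tokens in the User-Agent."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_SIGNATURES)

def select_page(user_agent: str) -> str:
    """Return the text-heavy variant for bots, the visual variant for humans."""
    if is_bot(user_agent):
        return "seo_variant.html"   # long, keyword-rich text
    return "visual_variant.html"    # lightweight copy, large images

if __name__ == "__main__":
    print(select_page("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
    print(select_page("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"))
```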

Cloaking based on the IP address

Each user accessing a website has an IP address that depends on their location and internet service provider. It is simply a series of numbers, visible in the server logs. IP cloaking consists of detecting an IP address and tailoring the content based on the information it reveals. This technique is generally used to adapt the content to the user's location. It can also be used to detect a search engine's crawler and serve it a very well optimized page. The bot will then “think” that the page is rich and relevant, and may place it in the top search results.
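A minimal Python sketch of the IP-based variant follows; the network range below is only a placeholder, not a verified crawler range, and real implementations rely on published crawler IP lists or reverse-DNS checks.

```python
import ipaddress

# Illustrative only: the network below is a placeholder, not a verified
# Googlebot range. Real implementations rely on published crawler IP lists
# or reverse-DNS verification.
CRAWLER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]

def is_crawler_ip(remote_addr: str) -> bool:
    """Check whether the visitor's IP falls inside a suspected crawler range."""
    addr = ipaddress.ip_address(remote_addr)
    return any(addr in net for net in CRAWLER_NETWORKS)

def select_page(remote_addr: str) -> str:
    # Serve the heavily optimized version to suspected crawlers,
    # and the regular (or geo-targeted) version to everyone else.
    return "seo_variant.html" if is_crawler_ip(remote_addr) else "regular_variant.html"

if __name__ == "__main__":
    print(select_page("66.249.66.1"))   # looks like a crawler
    print(select_page("203.0.113.42"))  # ordinary visitor
```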

Cloaking via JavaScript, Flash or DHTML

This technique takes advantage of browser settings. A user whose browser runs JavaScript receives a different version of the page than a user whose browser has JavaScript disabled. Nowadays, this method is largely obsolete, since disabling JavaScript is rarely an issue anymore, and Flash has been discontinued altogether.
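As a toy illustration, and assuming a crawler that does not execute JavaScript (as was once the case), the Python snippet below writes out a page whose static HTML carries the over-optimized text, while an inline script swaps in the “human” version as soon as JavaScript runs. The file name and copy are invented for the example.

```python
# Toy illustration of JavaScript-based cloaking: the static HTML holds the
# over-optimized text (what a non-JS client would index), while the inline
# script replaces it with the "human" version when JavaScript executes.
PAGE = """<!doctype html>
<html>
  <body>
    <div id="content">
      <!-- keyword-stuffed text seen only when JavaScript is disabled -->
      <p>Cheap widgets, best widgets, buy widgets online, widget store...</p>
    </div>
    <script>
      document.getElementById("content").innerHTML =
        "<img src='hero.jpg' alt='Widgets'><p>Welcome to our store!</p>";
    </script>
  </body>
</html>"""

if __name__ == "__main__":
    with open("cloaked_page.html", "w", encoding="utf-8") as f:
        f.write(PAGE)
```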

Old school cloaking using invisible text

This is the “dirtiest” and riskiest technique. It simply consists of adding invisible text (for example, white text on a white background) and spreading it all over the page. While it worked very well 10 to 15 years ago, it is now a bygone method that must be avoided. Google knows very well how to spot hidden text: unless a specific robots.txt directive blocks it (which is itself to be avoided), CSS is accessible to bots, so they can easily detect the trick!
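To give a rough idea of how easy such detection can be, here is a toy Python heuristic (not Google's actual method) that flags elements whose inline style sets white text, assuming a white page background; real crawlers also evaluate external stylesheets and computed styles.

```python
from html.parser import HTMLParser
import re

class HiddenTextDetector(HTMLParser):
    """Toy heuristic: flag elements whose inline style sets the text color
    to white, assuming a white page background. Real detection is far more
    sophisticated and also evaluates external stylesheets."""

    WHITE = re.compile(r"color\s*:\s*(#fff(fff)?|white)\b", re.IGNORECASE)

    def __init__(self):
        super().__init__()
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style") or ""
        if self.WHITE.search(style) and "background" not in style:
            self.suspicious.append((tag, style))

if __name__ == "__main__":
    detector = HiddenTextDetector()
    detector.feed('<body><p style="color:#ffffff">hidden keywords</p></body>')
    print(detector.suspicious)  # [('p', 'color:#ffffff')]
```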

Cloaking via HTTP_REFERER and Accept-Language

With the first method, the requester's HTTP_REFERER header is checked and, depending on its value (for example, whether the visitor arrived from a search results page), a different version of the page is displayed. The second method checks the user's Accept-Language header and, based on the match, displays a specific version of the website.
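Here is a minimal Python sketch of both header checks; the header values and page names are illustrative.

```python
# Minimal sketch of header-based cloaking; header values and page names
# are illustrative, not taken from any real site.

def select_page(headers: dict) -> str:
    referer = headers.get("Referer", "")
    accept_language = headers.get("Accept-Language", "")

    if "google." in referer:
        return "seo_variant.html"   # visitor (or bot) arriving via Google
    if accept_language.lower().startswith("fr"):
        return "page_fr.html"       # French-language version
    return "default.html"

if __name__ == "__main__":
    print(select_page({"Referer": "https://www.google.com/search?q=widgets"}))
    print(select_page({"Accept-Language": "fr-FR,fr;q=0.9"}))
    print(select_page({}))
```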

White Hat cloaking: smart and tolerated techniques

Any kind of cloaking is against Google's rules. However, with the rise of mobile, mega menus and faceted filters, many websites were quickly faced with the difficulty of reconciling SEO and UX. As a result, certain techniques have emerged that are currently "tolerated" by Google, as long as they are intended to improve the user experience.

Among the most common is geographic targeting: automatic geo-targeting determines where a visitor is coming from and which page they want to see, while robots are redirected to a non-geo-targeted page. This makes sense, as robots are "everywhere and nowhere". Link obfuscation is also an increasingly common practice considered White Hat cloaking (or Gray Hat for the more cautious). It consists of "hiding" some internal links from robots in order to optimize the crawl budget. Obfuscation is particularly relevant for websites with faceted navigation, which often generates hundreds or even thousands of pages that "consume" crawl without bringing in SEO visits. Hiding these internal links from robots makes it possible to focus the crawl on truly strategic pages, while still offering filter suggestions to users (see the sketch below). In other words, we help Google find the most relevant pages, and that's all it wants, right?
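As an illustration of the obfuscation idea, here is a minimal Python sketch that renders a low-value faceted-filter link as a neutral element carrying a Base64-encoded target, while strategic pages keep normal crawlable links. The markup, attribute names and the client-side decoding script it assumes are illustrative, not a standard.

```python
import base64

# Minimal sketch of link obfuscation (Gray Hat): instead of a crawlable
# <a href="...">, the link is rendered as a neutral element carrying a
# Base64-encoded target in a data attribute. A small client-side script
# (not shown) would decode it and navigate on click, so users can still
# use the filter while crawlers do not follow it.

def obfuscated_link(url: str, label: str) -> str:
    encoded = base64.b64encode(url.encode("utf-8")).decode("ascii")
    return f'<span class="js-link" data-target="{encoded}">{label}</span>'

def plain_link(url: str, label: str) -> str:
    return f'<a href="{url}">{label}</a>'

if __name__ == "__main__":
    # Strategic category page: keep a normal, crawlable link.
    print(plain_link("/shoes/", "Shoes"))
    # Low-value faceted filter: hide it from crawlers.
    print(obfuscated_link("/shoes/?color=red&size=42", "Red, size 42"))
```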

The risks involved and Google's sanctions

Black Hat cloaking can lead to severe algorithmic or manual penalties from Google, but also from other search engines. These penalties can be partial (hitting only the pages concerned by the cloaking) or affect the entire website. In other words, if you play with fire, you can burn the whole house down! Nowadays, Black Hat cloaking is therefore reserved for experts and/or “secondary” websites used, for example, for link building. Under no circumstances should you engage in cloaking on your “money” website! Even if some techniques still manage to slip past Google's vigilance today, there is no telling what tomorrow will bring.

Conclusion

When it comes to SEO, there are two schools of thought: Black Hat followers and “organic” SEOs. Some do not hesitate to take every risk, while others strictly follow Google's guidelines. In any case, if you do not have advanced technical expertise, we do not recommend embarking on such projects. Although very tempting, the various cloaking techniques are currently more risky than beneficial. You can still use White and Gray Hat techniques, which are less dangerous and quite effective. However, this requires active monitoring to make sure that Google does not tighten the rules in the meantime!

class="img-responsive
   Article written by Louis Chevant

Further reading

The complete guide to Internal Meshing

The step-by-step method for building your semantic cocoons, your internal linking and the optimal tree structure of your website.