There is nothing worse for an SEO or an advertiser than realizing that their website has received a manual penalty. Even just writing this gives us the chills! In this article, we’ll take a detailed look at the different types of manual actions, and the different solutions to deal with them.


What is a manual penalty, and how does it differ from an algorithmic penalty?

With manual actions, Google can demote or delete pages or websites as a whole. They are not related to changes in algorithms such as Penguin, Panda or Hummingbird. It’s a member of the Google team who manually “punishes” websites for breaking the rules.


Unlike algorithmic penalties, a manual penalty means that a website has shown enough problematic behaviour to be reviewed by a real person. So, in general, this is something you should worry about. The advantage (if we can call it that) is that Google will clearly notify you via a message in Search Console.

How to know the impact of a manual penalty on a website?

Some manual actions may not have a significant impact on a website’s organic traffic. Indeed, they may involve only a handful of pages that you consider non-strategic. To measure the risk, start by adding an annotation in Google Analytics on the date Google applied the manual action, and then monitor organic search traffic. If you notice a significant decrease, and you have excluded all other possible explanations (seasonality, trends…), then you can start a more in-depth investigation to determine which part of your website is affected.
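That before-and-after comparison can be sketched in a few lines. The function name, the sample numbers and the 20% threshold below are illustrative assumptions, not a standard:

```python
from statistics import mean

def traffic_drop(sessions_before, sessions_after, threshold=0.20):
    """Compare average daily organic sessions before and after the
    manual-action date. Returns (drop_ratio, significant). The 20%
    threshold is an arbitrary assumption; tune it to your site."""
    before_avg = mean(sessions_before)
    after_avg = mean(sessions_after)
    drop = (before_avg - after_avg) / before_avg
    return drop, drop >= threshold

# Example: one week of daily sessions before vs. after the penalty date
before = [1200, 1150, 1300, 1250, 1100, 1280, 1220]
after = [900, 870, 940, 910, 880, 860, 930]
drop, significant = traffic_drop(before, after)
```

If `significant` comes back `True` once seasonality and trends are ruled out, that is the signal to dig into which pages were hit.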


However, if you do not see fluctuations in organic traffic, there is no reason to be worried. Errors will need to be fixed, but you will have time to think about that.

How to avoid a manual penalty on a website?

To guard against possible manual actions, you must keep in mind the reasons why you could be penalized.

Unnatural links toward your site

This is a common situation. There are two types of manual actions relating to unnatural links. The first concerns unnatural links that bots have found but that are beyond your control (spam). Google knows this kind of link well, and they do not impact a website’s overall ranking. The second type concerns practices aimed at manipulating algorithms through networks of websites (PBNs, private blog networks) or link buying. In other words, Google knows how to differentiate the links you are not responsible for from the links you created yourself.

If the Search Console message concerns the second type of link, you may face a long and difficult road out. You'll have to show Google that you made an effort to remove as many unnatural links as possible, and that isn't always easy.

Once you've removed (or attempted to remove) the links deemed too artificial, you can submit a reconsideration request from Search Console. If Google's team feels that you've done a good job of removing unnatural links, they can lift the penalty. Otherwise, the reconsideration request will be denied, and you will have to carry on and make further efforts.

Note that even if the penalty is lifted, you may not fully recover all your organic traffic. Indeed! Since these links helped position the website well on specific keywords, removing them can only have a negative effect on your rankings. In that case, you’ll have to start from scratch, and work (properly this time!) on creating links which follow the rules dictated by Google.

Unnatural links from your site

Google does not only penalize websites that receive artificial links. It applies the same sanction to websites that link out to other websites with artificial or purchased links (so-called outbound links). Here, the manual penalty can affect every website that is part of the "scheme".

In this case, once again, there is nothing to do but remove the paid or traded links. If you want to keep them anyway, you can add a rel="nofollow" attribute to them, which tells Google that you are not using these links to manipulate its algorithms. Depending on the number of links, this task can be very time-consuming!
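As a sketch of that second option, here is a minimal, regex-based way to add `rel="nofollow"` to links. The function name is hypothetical, and a real site would do this in its templates or with a proper HTML parser rather than with regular expressions:

```python
import re

def add_nofollow(html):
    """Add rel="nofollow" to <a> tags that do not already carry a rel
    attribute. Regex-based sketch only; use a real HTML parser or fix
    your templates in production."""
    def patch(match):
        tag = match.group(0)
        if 'rel=' in tag:
            return tag  # leave existing rel attributes untouched
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s[^>]*>', patch, html)

html = '<p>See <a href="https://example.com/partner">our partner</a>.</p>'
print(add_nofollow(html))
```

Links that already declare a `rel` value (for instance `rel="sponsored"`) are deliberately left alone.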

It may happen that you cannot find any outbound links in your content. In that case, the culprit is usually a dofollow link embedded in the comments or the forum attached to the website. These are sometimes the ones Google has identified as unnatural outbound links.

Hacked sites

No one is safe from hacking, and even if it’s not your fault, Google will take manual action on the website as soon as it detects suspicious code. To lift this penalty, you must remove all the injected snippets and malware that have infected the website as quickly as possible. But cleaning up a hack is not an easy task for everyone; you can use specialized services like SiteLock or Sucuri. Once the website is free of the malicious software, you will need to make sure it cannot happen again by hardening the website’s security.
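As an illustration of what "suspicious code" often looks like, the heuristic below flags a few patterns commonly seen in hacked PHP/JS files. The pattern list is a hand-picked assumption; a match only warrants manual review, and services like Sucuri use far more thorough detection:

```python
import re

# Common indicators found in hacked PHP/JS files. A match is only a
# *hint* that warrants manual review, not proof of infection.
SUSPICIOUS_PATTERNS = [
    r'eval\s*\(\s*base64_decode',        # classic obfuscated PHP payload
    r'gzinflate\s*\(',                   # compressed hidden code
    r'document\.write\s*\(\s*unescape',  # injected JavaScript
]

def looks_suspicious(source):
    """Return the list of patterns matched in a file's source code."""
    return [p for p in SUSPICIOUS_PATTERNS if re.search(p, source)]

infected = "<?php eval(base64_decode('aGVsbG8=')); ?>"
```

Running `looks_suspicious` over every file in the web root gives a shortlist of files to inspect by hand.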

Low quality content websites

If Google considers the content to be of no value, it may apply a manual penalty. This is particularly the case for automatically generated content, pages with only affiliate product feeds, paraphrased content, etc.

If the website has any of these types of content, you will need to remove or enrich it, and make each page unique. Once you've optimized the content, you can submit a reconsideration request. For those who do not have the time to create quality content, the best option is to hire an external provider. Even if it represents an investment, it has a major impact on the website’s longevity and success, so it quickly pays for itself!

Spam websites and user-generated spam

The manual action for pure spam is unforgiving. It targets, in particular, botched content and automated gibberish (badly done content spinning). The only way to counter this manual action is to clean up all the pages and links that Google considers spam.

If you’ve got a blog, a forum, social networks, or a website with public profiles, you may also receive a manual action for user-generated spam. Yes, that's not fair... This spam can be found in blog comments, forum posts, and profiles created solely for spamming. That’s why it’s essential to install systems like CAPTCHAs to differentiate real users from spambots.

Cloaking and misleading redirects

Cloaking is the act of showing different content to bots and to users. This black hat method has long been used to manipulate algorithms, as have “misleading” redirects, which send internet users to pages they did not intend to visit. Cloaking and misleading redirects can sometimes be the result of hacking, but some use them deliberately to improve their rankings.
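Server-side, cloaking boils down to branching on who is asking. This deliberately simplified sketch shows the pattern so you can recognize it in a hacked or inherited codebase; it is against Google's guidelines, not something to deploy:

```python
def served_content(user_agent):
    """Illustration of what cloaking looks like server-side: crawlers
    and humans receive different pages. Shown only so the pattern can
    be recognized; doing this violates Google's guidelines."""
    if 'Googlebot' in user_agent:
        return 'keyword-stuffed page optimized for crawlers'
    return 'unrelated page shown to real visitors'
```

If you find a branch like this in your server code or `.htaccess` rules after a hack, that is the code to remove.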

If there is a manual penalty, determine the nature of the hack and get rid of it. However, if these techniques were deliberate, you will have no trouble returning to clean code, because you will already know which pages are concerned.

Hidden text and keyword stuffing

One might wonder who still uses this “old school” SEO method. Obviously, if Google is still talking about it, it’s because some people are still playing with it. Once again, it can also come from a hack. So, if you are "innocent", it’s better to ask a specialist to check the website.


The reasons behind a manual penalty are diverse, and do not always call the site owner’s good faith into question. A manual action can alert you to a danger on the website (hacking) or sanction an obvious violation of Google's rules. In any case, it’s better to take it seriously and quickly rectify the situation to put the website back on the right track. Be careful: once the penalty is lifted, do not think you are totally out of the woods. The Panda, Penguin and Hummingbird algorithms are constantly on the lookout, and are just as capable as a Google staff member of downgrading a website.

   Article written by Louis Chevant

Further reading

The complete guide to Internal Linking

The step-by-step method to build your semantic cocoons, your internal linking and the optimal tree structure of your website.