Has your site fallen off the map without you knowing it?

Written by Roberto Mazzoni

Sucuri WordPress protection can block search engines from accessing your site altogether. I discovered this by accident after activating the service on one of my websites. All crawling was effectively stopped: only a small part of the home page was visible to crawlers, and none of the remaining several hundred pages showed up at all. Those pages were still individually accessible, but when Google or any other crawler arrived at the home page, there was nothing there to follow.

Sucuri is one of the best-known security companies offering protection for WordPress, and we turned to them after a malware attack on a site we had hosted with Bluehost. We were in a hurry because Bluehost would keep the site offline until we contracted a security company to clean it up, so we hired Sucuri and paid for a full year of protection. The Sucuri team cleaned the site and changed its configuration so that all incoming traffic would pass through their proxy.

Everything seemed fine. The site was back online and we could navigate all of its sections by hand. We also kept updating it regularly, yet it dropped in its Google rankings. Eventually, a bit of research with a free tool named Screaming Frog revealed the horrible truth: our site had become completely invisible to Google and no indexing was taking place anymore. We didn't reach that conclusion immediately, though. We initially used another tool, SEMrush, to diagnose the issue, and it reported a problem with the robots.txt file. This particular file can block Google's access altogether if it is not properly configured.

We checked the file and read Google's extensive documentation on the subject, but we could not find anything wrong. We were surprised, since WordPress takes care of this configuration automatically: it is virtually impossible to end up with a broken robots.txt file in WordPress unless you tamper with it, and we hadn't.
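If you want to run the same sanity check on your own site, a minimal sketch using Python's standard urllib.robotparser module could look like the following. The domain and paths are hypothetical placeholders; swap in your own URLs.

```python
# Minimal sketch: check whether robots.txt allows Googlebot to crawl key URLs.
# Assumes Python 3; the domain and paths below are hypothetical placeholders.
import urllib.robotparser

SITE = "https://www.example.com"

parser = urllib.robotparser.RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt

# can_fetch() returns True if the given user agent may crawl the URL
for path in ("/", "/wp-admin/", "/a-sample-article/"):
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(("allowed" if allowed else "BLOCKED") + "  Googlebot -> " + SITE + path)
```

If this reports everything as allowed, as it did in our case, the problem is unlikely to be in robots.txt itself.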

Tracking the Issue to Sucuri WordPress Protection

We did get some ranking improvements in the meantime, because we were linking directly to specific articles from videos we had published on YouTube. Yet the visibility of the site as a whole was clearly broken. We continued our research with other tools, such as Open Site Explorer by MOZ, which located only five pages in total. That confirmed there was a problem, though we still didn't know what it was. Only by running Screaming Frog's full crawler and tweaking its configuration settings did we pin down the cause. In the end we had to remove Sucuri's protection altogether to make the site accessible to Google again.
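A quick way to spot this kind of over-protective filtering, assuming it keys on the User-Agent header (some firewalls filter by IP address or a JavaScript challenge instead, which this simple test will not reveal), is to fetch the same page as a browser and as a crawler and compare what comes back. A minimal sketch in Python, again with a placeholder domain:

```python
# Minimal sketch: compare what a regular browser and a crawler user agent get
# back from the home page. Assumes the blocking keys on the User-Agent header;
# firewalls that filter by IP or JavaScript challenge won't show up this way.
import urllib.error
import urllib.request

URL = "https://www.example.com/"  # hypothetical placeholder domain

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for label, agent in USER_AGENTS.items():
    request = urllib.request.Request(URL, headers={"User-Agent": agent})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            body = response.read()
            print(f"{label}: HTTP {response.getcode()}, {len(body)} bytes")
    except urllib.error.HTTPError as err:
        print(f"{label}: blocked with HTTP {err.code}")
```

A large difference in response size or status code between the two fetches is a strong hint that crawlers are being served something different from what human visitors see.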

You might therefore find yourself in the same predicament without knowing it. We are not criticizing Sucuri for their actions: they wanted to ensure total protection from unwanted crawlers, and when you are under attack that can make sense, because you don't want to be blacklisted. After the attack we moved our hosting to DigitalOcean and implemented our own security measures, so we could remove Sucuri and regain visibility on Google. Use the advice in this article as you see fit. It may well be possible to configure Sucuri's protection so as to let Google through; we didn't check. Just be aware that over-protective security software can take your site off the grid, and act accordingly.

About the author

Roberto Mazzoni

Author, technology journalist, blogger and international entrepreneur.
