One rule we all follow when writing firewall or .htaccess rules is to never block search engine crawlers like Google, Bing, or Yahoo.
Sometimes, to work out where an attack is coming from, we have to dig through the logs. And occasionally we uncover a scenario that makes us hold our head in our hands and ask: “Do I really have to block this one? Why him?”
The GET requests
Until recently, this scenario had gone unnoticed; it was uncovered by chance. While digging through logs, a security consultant (source) discovered that Google was attempting SQL injection attacks against his client’s website. He naturally blocked the IP address, assuming it was spoofed, but then realized it was not: it really was an address belonging to the Googlebot.
It really was the Googlebot, and it really did deliver the SQL injection payload via a plain GET request.
So has Google been hacked? Has the IP address been spoofed? Is it a fake bot set up by a hacker? Not at all! Let’s take a closer look at the IP:
The lookup doesn’t lie: it really is Google that …
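You can reproduce this check yourself. Google documents a two-step verification: a reverse DNS lookup on the requesting IP must resolve to a hostname under googlebot.com (or google.com), and a forward lookup of that hostname must map back to the same IP. Here is a minimal sketch in Python (the function names are mine, not from any official tool):

```python
import socket

def is_google_hostname(hostname: str) -> bool:
    """True if the reverse-DNS name belongs to Google's crawler domains."""
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip: str) -> bool:
    """Two-step verification:
    1. reverse DNS on the IP must resolve under googlebot.com / google.com,
    2. forward DNS on that hostname must map back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except OSError:
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward confirm
    except OSError:
        return False
```

The hostname check alone is not enough, because anyone can create a PTR record claiming to be Googlebot; the forward lookup is what closes that loophole.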
What’s going on?
Google has no interest in launching these attacks on websites, but an attacker does. He can use Google’s bots to carry out the attack for him: no server resources of his own spent, no time wasted doing it himself, and no risk of being seen or traced through his own IP address.
In this scenario, the bot was crawling site A, the attacker’s site. That site contains a number of links, some of which point to site B, the victim’s site. The attacker simply planted links containing SQL injection attempts, then counted on Google’s bots to visit those links and do the dirty work for him! It is hard to say whether this has happened before; probably yes, but who pays attention to that?
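As an illustration of what such a trap might look like (the URL, parameter name, and table name here are entirely hypothetical), the attacker only needs to publish an ordinary-looking link whose query string carries the payload:

```html
<!-- hypothetical link planted on the attacker's site A;
     the crawler follows it and sends the GET request to the victim -->
<a href="http://victim.example/page.php?id=1+UNION+SELECT+user,password+FROM+wp_users--">
  harmless-looking anchor text
</a>
```

When the crawler fetches this URL, the victim’s server receives the GET request from Google’s IP, exactly as in the logs described above.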
We cannot say that Google is responsible for these attacks; after all, it is just doing its job: crawling sites and following links. The responsible party is the one who set the trap, not the one who springs it inadvertently or out of ignorance.
How to protect your WordPress?
So what can we do about this? How do we avoid receiving such attacks from a site we fully trust? I don’t think whitelisting trusted sites should be the first line of defense. We should first verify that the request does not contain suspicious keywords, such as typical SQL injection strings.
For this, there is the BBQ (Block Bad Queries) plugin, which blocks requests it considers malicious. I invite you to try it.