PLEASE NOTE:
This article currently applies ONLY to our Linux Shared hosting servers.
Our Cloud VM/VPS and Dedicated Servers are not affected by this.
We are blocking some web bots due to their often aggressive crawling techniques, which consume both bandwidth and server resources. The following bots are currently blocked (as of writing):
babbar.tech
Barkrowler
Bytespider
DotBot
GPTbot
MJ12bot
Petalbot
Petalsearch
Yandex
Amazonbot/0.1
facebookexternalhit/1.1
meta-externalagent/1.1
If you must allow one or more of these, you can do so by adding an entry to each website's .htaccess file. For example, to allow the two Meta crawlers:
# Whitelist Web Bots
# The User-Agent value is a regular expression, so literal dots are escaped
SetEnvIfNoCase User-Agent "facebookexternalhit/1\.1" Whitelist
SetEnvIfNoCase User-Agent "meta-externalagent/1\.1" Whitelist
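The same pattern works for any of the other bots listed above; the value after User-Agent is matched case-insensitively as a regular expression. As an example only (allow these bots only if you genuinely need them):
# Whitelist additional bots - example only
SetEnvIfNoCase User-Agent "Bytespider" Whitelist
SetEnvIfNoCase User-Agent "DotBot" Whitelist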
We have also blocked requests that do not set a User-Agent (i.e. it is blank); to allow those requests, you would add the following entry to each website's .htaccess file:
# Whitelist requests with an empty (or "-") User-Agent
SetEnvIfNoCase User-Agent "^-?$" Whitelist
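For context, setting the Whitelist variable works because our server-level block rules skip requests where it is present. The actual rules are managed on our side and are not shown here, but conceptually (an illustrative sketch only, using Apache 2.4 syntax, with "BlockedBot" as a hypothetical variable standing in for our bot matching) they behave like this:
# Illustrative sketch only - the real block rules live at server level
SetEnvIfNoCase User-Agent "Bytespider" BlockedBot
<RequireAny>
    # Requests you have whitelisted in .htaccess are allowed through
    Require env Whitelist
    # Everything else is allowed only if it did not match a blocked bot
    <RequireAll>
        Require all granted
        Require not env BlockedBot
    </RequireAll>
</RequireAny>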
However, it is important to note that:
- The listed bots can negatively impact server resources and affect website operation and uptime; as such, we advise only allowing access to blocked bots if absolutely necessary.
- If, after a bot has been allowed access, we find that it is consuming resources aggressively and/or causing instability on the server, we may block it again; if that happens, you may need to move the website to a Cloud VPS/Dedicated server.
Where Do I Put This Code?
Please refer to the following article for more information on where to place this code:
Protection For Common Login Areas
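As a general illustration only (the linked article is the authoritative guide), the whitelist lines normally sit near the top of the website's .htaccess file, before any existing rules such as a WordPress rewrite block:
# Whitelist Web Bots (placed near the top of .htaccess)
SetEnvIfNoCase User-Agent "facebookexternalhit/1\.1" Whitelist
SetEnvIfNoCase User-Agent "^-?$" Whitelist

# ... any existing rules in the file follow, for example:
# BEGIN WordPress
# <IfModule mod_rewrite.c>
# ...
# END WordPress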
Additional Support
As always, should you require further assistance with this issue, please contact our Helpdesk, explaining your issue and what you have done so far, and we will be happy to assist you further.
Please see our full contact information on our main website: Contact Us.
Alternatively, email help@blacknight.com for assistance.