Protecting a server is essential, and there are several ways to shield your websites and apps from malicious bots. We’ll look at different kinds of bots, how they operate, and how you can use Plesk’s security measures to secure Nginx against them.
Malicious Bot Types
There are bots that scan for API keys in Git repositories (scanbots) and bots that download entire web pages. Worse still, hackers use groups of hijacked computers (botnets) to take down websites, or to send out innumerable spam emails (spambots). Let’s take a deeper look at the latter two.
Bots For Spamming Emails
Spambots are special programs that crawl the internet for email addresses posted in forums, discussion boards, comments and websites. Spam generally means unwanted and unsolicited email. Spambots usually look for ‘mailto’ expressions (the HTML markup used to display email addresses online), in a format such as the one below.
<a href="mailto:[email protected],[email protected],[email protected],[email protected]?subject=Web%20News">Email Us</a>
Apart from mailto, some site owners resort to spelling addresses out in words, just to make it harder for spambots to harvest them. For instance, instead of ‘‘[email protected]’’, they write support[at]abz[dot]com on the web. However, spam programs identify these formats too, and still reach users, costing time, money and productivity.
Bots For Hijacking Computers
Malicious botnets are networks of computers infected with malicious software and controlled as a group by hackers, typically to perform distributed denial-of-service (DDoS) attacks. A botnet gives malware a way into networks and lets attackers control them.
Let’s look at how attackers use botnets to hijack computers by studying a click-fraud botnet that made a profit for its creators through fraudulent clicks on Google search results.
Paco Redirector is a botnet trojan that hijacked traffic from search engines such as Google and Bing. Here’s how it works:
- First, it infects users’ computers when they download and install fake versions of popular software
- Afterward, Paco changes the browser’s local registry keys, adding two entries that make the malware start at boot time.
- Finally, the malware installs a proxy configuration file that captures traffic and routes it through the attackers’ command-and-control server.
How to Secure Your Nginx Server Against Malicious Bots
Since so many websites run on Nginx, it’s worth knowing how to secure it against malicious bots. We can protect the resources running on Nginx servers by using Plesk extensions and Fail2ban.
1. Using SpamExperts Email Security Extension
SpamExperts specifically protects a hosting environment from threats like spam and viruses. It comes with an incoming filter that separates valid emails from unsolicited ones, and an outgoing filter that keeps your IP address off blacklists, since spam can be sent from a compromised account within your web infrastructure.
2. Using DDOS Deflate Interface Extension
Hackers often use malicious bots to brute-force authentication automatically. The DDOS Deflate Interface extension mitigates such attacks by blocking IP addresses that exceed a configured connection threshold.
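Independently of any extension, Nginx itself can throttle abusive clients with its built-in limit_req module. A minimal sketch (the zone name, the /login path and the limits here are illustrative examples, and this is separate from the DDOS Deflate extension):

```nginx
# Goes in the http {} block: track clients by IP, allow 10 requests/second each.
limit_req_zone $binary_remote_addr zone=login:10m rate=10r/s;

server {
    # Apply the limit only to a hypothetical login endpoint.
    location /login {
        # Queue up to 5 bursty requests; reject the rest with an error.
        limit_req zone=login burst=5;
    }
}
```

Rate limiting like this slows bots down; the Fail2ban setup described below goes further and bans the offending IPs outright.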
3. Using Fail2ban to Block Internet Bots
Fail2ban is intrusion-prevention software that protects servers such as Nginx from bot attacks. On Ubuntu, you can install Fail2ban with the following command:
apt-get install fail2ban
On Fedora and CentOS, use the command below instead:
yum install fail2ban
Afterwards, copy the default configuration into a local file, which Fail2ban reads for your custom overrides:
cp /etc/fail2ban/jail.conf /etc/fail2ban/jail.local
Open the copied Fail2ban jail configuration file in a text editor.
Search for the maxretry parameter and set it to 5. maxretry limits the number of failed attempts allowed from a host; if a host exceeds this limit, it is banned. Apart from maxretry, the configuration file contains other parameters such as ignoreip, which sets the list of IP addresses that will never be banned.
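As a sketch, a [DEFAULT] section with these overrides might look like this (the values are illustrative examples, not recommendations):

```ini
[DEFAULT]
# IPs that Fail2ban should never ban (localhost here as an example)
ignoreip = 127.0.0.1/8 ::1
# How long a ban lasts, in seconds
bantime  = 600
# Failed attempts allowed before a host is banned
maxretry = 5
```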
Then execute the following commands to enable and start the Fail2ban service on the server:
sudo systemctl enable fail2ban
sudo systemctl start fail2ban
Now let’s go ahead and configure Fail2ban to monitor the Nginx server logs.
Because hackers use bots to brute-force logins, we can create dedicated jails for authentication attempts by adding the following content to the jail configuration under [nginx-http-auth]:
enabled = true
filter = nginx-http-auth
action = iptables-multiport[name=NoAuthFailures, port="http,https"]
logpath = /var/log/nginx*/*error*.log
bantime = 600
maxretry = 6
[nginx-login]
enabled = true
filter = nginx-login
action = iptables-multiport[name=NoLoginFailures, port="http,https"]
logpath = /var/log/nginx*/*access*.log
bantime = 600
maxretry = 6
Finally, create the filter for [nginx-http-auth] inside the /etc/fail2ban/filter.d directory, which holds all of the filter definition files. Open the file nginx-http-auth.conf and add the following line to the failregex specification.
^ \[error\] \d+#\d+: \*\d+ no user/password was provided for basic authentication, client: <HOST>, server: \S+, request: "\S+ \S+ HTTP/\d+\.\d+", host: "\S+"\s*$
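Before restarting Fail2ban, you can sanity-check a pattern like this with a short Python sketch. The log line below is invented, and Fail2ban’s <HOST> tag is replaced with a plain capture group for the test (Fail2ban also strips the timestamp prefix from each log line before matching, which is why the pattern starts at the space before [error]):

```python
import re

# The failregex, with Fail2ban's <HOST> tag swapped for a named capture group.
pattern = re.compile(
    r'^ \[error\] \d+#\d+: \*\d+ no user/password was provided for basic '
    r'authentication, client: (?P<host>\S+), server: \S+, '
    r'request: "\S+ \S+ HTTP/\d+\.\d+", host: "\S+"\s*$'
)

# Invented sample line in the shape Nginx writes to its error log
# (timestamp prefix already stripped, as Fail2ban would do).
line = (' [error] 12345#12345: *8 no user/password was provided for basic '
        'authentication, client: 203.0.113.5, server: example.com, '
        'request: "GET /admin HTTP/1.1", host: "example.com"')

match = pattern.match(line)
```

If the match succeeds and the host group contains the client IP, the jail will be able to extract the address to ban.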
Save and close nginx-http-auth.conf. You can now activate your Nginx jails by using the following command:
sudo service fail2ban restart
These solutions are not the only ways to stop bots from attacking your Nginx server, but they are reliable methods for avoiding the worst effects of malicious bots. Get in touch with one of our Plesk experts if you need further assistance with a bot attack.
How useful and straightforward was this guide? Any issues? Let us know in the comments below.