Tuesday, 16 December 2014

Mod_security setup against DDoS Attack

Version: mod_security-2.6.7-2.el5

Apache version: 2.2.3

We can protect our website from attackers who send a flood of requests by using the
mod_security module in the Apache web server. This protection method blocks an IP
address if we receive a large number of hits within a particular interval.

If needed, we can allow a large number of requests from a particular IP address.

If needed, we can allow search-engine crawlers using the User-Agent method.

Log file path: /var/log/httpd/modsec_audit.log & /var/log/httpd/modsec_debug.log
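For reference, these log locations are set by the ModSecurity logging directives in the Apache configuration. A minimal sketch, using the paths above (the audit engine mode and debug level shown here are assumptions, not confirmed from this setup):

```apache
# Audit log: records full transaction details for matched/blocked requests
# (RelevantOnly is a common choice; the mode used in this setup is assumed)
SecAuditEngine RelevantOnly
SecAuditLog /var/log/httpd/modsec_audit.log

# Debug log: level ranges 0 (off) to 9 (most verbose); 3 is a typical default
SecDebugLog /var/log/httpd/modsec_debug.log
SecDebugLogLevel 3
```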

ModSecurity 2.x allows rules to be placed in one of the following five phases of the Apache
request cycle. We are using phase 1 in our configuration:
Phase 1: Request headers (REQUEST_HEADERS)
Phase 2: Request body (REQUEST_BODY)
Phase 3: Response headers (RESPONSE_HEADERS)
Phase 4: Response body (RESPONSE_BODY)
Phase 5: Logging (LOGGING)
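To make the phase placement concrete, here is a sketch of similar checks attached to two different phases (the patterns below are illustrative examples only, not rules from this setup):

```apache
# Phase 1: inspect request headers, before the request body is read
SecRule REQUEST_HEADERS:User-Agent "BadBot" "phase:1,deny,status:403"

# Phase 2: inspect request body arguments, after the body has been buffered
SecRule ARGS "attack-string" "phase:2,deny,status:403"
```

Phase 1 rules run earliest and are the cheapest place to block, which is why the anti-DDoS rules below all use phase:1.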

For example:
Using the configuration below, we can block users who send more than 20 requests in a 10-second
period from the same IP. They will be blocked for 30 seconds unless this is a frequent
occurrence: if they are blocked more than five times within five minutes, they will be blocked
for five minutes.

Add the below lines to the mod_security.conf file (path: /etc/httpd/conf.d):
SecRule REMOTE_ADDR "^$" "phase:1,t:none,allow,nolog,ctl:ruleEngine=off"
SecRule REMOTE_ADDR "^$" "phase:1,t:none,allow,nolog,ctl:ruleEngine=off"
SecRule REQUEST_HEADERS:User-Agent "Googlebot" "phase:1,t:none,allow,nolog,ctl:ruleEngine=off"
SecRule REQUEST_HEADERS:User-Agent "Yahoo! Slurp" "phase:1,t:none,allow,nolog,ctl:ruleEngine=off"
SecRule REQUEST_BASENAME "!(css|doc|flv|gif|ico|jpg|js|png|swf|gz|pdf)$" "phase:1,nolog,pass,initcol:ip=%{REMOTE_ADDR},setvar:ip.requests=+1"
SecRule ip:requests "@le 2" "phase:1,nolog,expirevar:ip.requests=10"
SecRule ip:requests "@ge 20" "phase:1,pass,nolog,setvar:ip.block=1,expirevar:ip.block=30,setvar:ip.blocks=+1,setvar:ip.requests=0,expirevar:ip.blocks=300"
SecRule ip:blocks "@ge 5" "phase:1,deny,log,logdata:'req/sec: %{ip.requests}, blocks: %{ip.blocks}',status:403"
SecRule ip:block "@eq 1" "phase:1,deny,log,logdata:'req/sec: %{ip.requests}, blocks: %{ip.blocks}',status:403"
ErrorDocument 403 "<html><body><h2>Too many requests.</h2></body></html>"

Rule Details:

Rule 1&2: Allowing a large number of requests from a particular IP & the local network. Here,
substitute your own IP address for the placeholder pattern.
Rule 3&4: Allowing a large number of requests from the Google & Yahoo search-engine crawlers.
Rule 5: Ignoring media files, count the requests made in the past 10 seconds.
Rule 6: We want the var to expire, so we leave it alone. Combined with the increment in the rule
above, the timer never expires unless there are absolutely no requests for 10 seconds.
Rule 7: If there were more than 20 requests in 10 seconds from this IP, set var block to 1 (expires
in 30 seconds) and increase var blocks by one (expires in 5 minutes).
Rule 8: If the user was blocked more than 5 times (var blocks >= 5), log and return HTTP 403.
Rule 9: If the user is blocked (var block = 1), log and return HTTP 403.
Rule 10: Error message.
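For rules 1 & 2, the empty pattern "^$" in the configuration above is a placeholder. A sketch of what the filled-in whitelist rules might look like, using the same regex style as the rest of the config (the address and network below are example values, not the ones used in this setup):

```apache
# Allow unlimited requests from one trusted IP (example address)
SecRule REMOTE_ADDR "^203\.0\.113\.10$" "phase:1,t:none,allow,nolog,ctl:ruleEngine=off"

# Allow unlimited requests from the local network (example 192.168.1.x range)
SecRule REMOTE_ADDR "^192\.168\.1\." "phase:1,t:none,allow,nolog,ctl:ruleEngine=off"
```

Note that the dots must be escaped, since the first argument is a regular expression; an unescaped "192.168.1.10" would also match, for example, "192x168x1x10".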

More details:

Since we should allow a large number of requests from search-engine crawlers such as Googlebot and
Yahoo, we first tried to allow them by domain (e.g., googlebot.com), but we were not able to do it
using this method. However, we can meet this requirement using the User-Agent header.
When we checked our website (LS) log file, we found the details below:

Yahoo: - - [26/Nov/2014:23:30:18 +0000] "GET /blog/secret-to-healthy-eating/ HTTP/1.1" 200
8490 515039 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp;

Google: - - [25/Nov/2014:19:21:33 +0000] "GET / HTTP/1.1" 200 29633 421334 "-"
"Googlebot/2.X (+http://www.googlebot.com/bot.html)"
- - [26/Nov/2014:00:13:19 +0000] "GET / HTTP/1.1" 200 29637 158050 "-" "Mozilla/5.0
(compatible; Googlebot/2.1 +http://www.googlebot.com/bot.html)"

Here, whenever they send a request to our website, they use the "Yahoo! Slurp" and "Googlebot"
User-Agent strings, so we can allow them using a User-Agent based rule such as:
Rule: SecRule REQUEST_HEADERS:User-Agent "ApacheBench" phase:1,nolog,allow,ctl:ruleEngine=off
Note: we tested using the rule "SecRule REQUEST_HEADERS:User-Agent "ApacheBench"
phase:1,nolog,allow,ctl:ruleEngine=off". When we sent a number of requests using the ab command, our
requests were allowed based on the User-Agent.

Allowing IP:
We tested it in the local network using an IP address. When we sent a large number of requests from
this IP, all of the requests were allowed and returned response code 200.

Allowing media files:
We tested this too in the local network. We created a gz-format file and placed it in a particular domain (e.g.:
http://abc.com/test.gz). When we sent a number of requests to this URL, we also got a 200 response.
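If other static file types need the same exemption, the extension list in rule 5 can be extended. A sketch (adding mp4 and zip here is an assumption for illustration, not part of the original setup):

```apache
# Extended exclusion list: requests for these extensions are not counted
# toward the per-IP rate limit (mp4 and zip added as hypothetical examples)
SecRule REQUEST_BASENAME "!(css|doc|flv|gif|ico|jpg|js|png|swf|gz|pdf|mp4|zip)$" \
    "phase:1,nolog,pass,initcol:ip=%{REMOTE_ADDR},setvar:ip.requests=+1"
```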

Log details:
If anyone sends a large number of requests from a particular IP, that IP will be blocked and the
details stored in modsec_audit.log & the site access log in the format below:

In modsec_audit.log:
[15/Dec/2014:05:19:24 +0530] xpMAYwoBBUEAACkxCdcAAAAF 58473 80
GET / HTTP/1.1

Host: gailoadtest.com

User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20100101 Firefox/24.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
If-Modified-Since: Wed, 26 Nov 2014 09:21:57 GMT
If-None-Match: "c004d6-6b-508bf8f269740"
Cache-Control: max-age=0
HTTP/1.1 403 Forbidden
Accept-Ranges: bytes
Content-Length: 4958
Connection: close
Content-Type: text/html; charset=UTF-8

Message: Access denied with code 403 (phase 1). Operator EQ matched 1 at IP:block. [file
"/etc/httpd/conf.d/mod_security.conf"] [line "104"] [data "req/sec: 0, blocks: 1"]
Action: Intercepted (phase 1)
Stopwatch: 1418600964620387 516 (- - -)
Stopwatch2: 1418600964620387 516; combined=157, p1=85, p2=0, p3=0, p4=0, p5=37, sr=38, sw=35, l=0, gc=0
Producer: ModSecurity for Apache/2.6.7 (http://www.modsecurity.org/).

Server: Apache/2.2.3 (CentOS)
--d147a861-Z--

In access log:
- - [02/Dec/2014:17:19:00 +0530] "GET /index.html HTTP/1.0" 403 69 "-" "ApacheBench/2.3"

Ref URL:
https://www.atomicorp.com/wiki/index.php/Modsecurity_audit_log → Audit log details
NOTE: The above testing process was completed on a test machine.
