Yesterday I visited a company suffering from periodic performance problems on their .NET website. From the log files we could see that the web server had received 18,000+ hits in a single hour during the night. The IT manager stated that this number was way beyond normal traffic patterns. The question was whether all of these requests could have been sent from a single machine.
No problem. In a few minutes I put together the following query for Log Parser:
"c:\program files\log parser 2.2\logparser"
"SELECT TOP 10 COUNT(*), c-ip
INTO TopTenIps.txt
FROM ex070601.log
WHERE EXTRACT_EXTENSION(cs-uri-stem) = 'aspx'
GROUP BY c-ip
ORDER BY COUNT(*) DESC"
The output was the following:
COUNT(ALL *) c-ip
———— —————
100033 82.17.208.4
153 214.150.46.211
47 39.99.13.122
30 196.2.114.1
16 82.52.143.18
11 126.225.159.150
9 66.55.208.94
7 66.55.208.179
7 201.47.30.107
4 66.55.212.236
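For environments without Log Parser, the same top-ten query can be sketched in a few lines of Python. This is a minimal sketch assuming a standard W3C extended log whose `#Fields:` directive includes `c-ip` and `cs-uri-stem`, as IIS writes by default; the function name `top_ips` is my own.

```python
from collections import Counter

def top_ips(log_lines, n=10):
    """Count .aspx requests per client IP from a W3C extended log.

    Assumes the log contains a '#Fields:' directive naming the columns,
    including c-ip and cs-uri-stem (the IIS default format).
    """
    fields = []
    counts = Counter()
    for line in log_lines:
        line = line.rstrip("\n")
        if line.startswith("#Fields:"):
            # Column names follow the directive keyword
            fields = line.split()[1:]
            continue
        if not line or line.startswith("#"):
            continue  # skip other directives and blank lines
        row = dict(zip(fields, line.split()))
        # Count only dynamic .aspx requests, like the Log Parser query
        if row.get("cs-uri-stem", "").endswith(".aspx"):
            counts[row.get("c-ip", "-")] += 1
    return counts.most_common(n)

# Usage:
# with open("ex070601.log") as f:
#     for ip, count in top_ips(f):
#         print(count, ip)
```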
100K hits from the same IP is of course very extreme. They were able to trace the IP address to an in-house search engine crawling the website at up to 17,000 hits per hour, more than four hits per second. So be careful with web crawlers ;-). If foreign IPs are hammering your website, find a way to reject the requests before they consume too many resources on your server.
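One way to reject such requests cheaply is a per-IP sliding-window rate limiter in front of the expensive page handlers. Below is a minimal in-memory sketch (the class name `IpRateLimiter` and the limits are my own illustration); in production this job usually belongs to a firewall, reverse proxy, or an IIS module rather than application code.

```python
import time
from collections import defaultdict, deque

class IpRateLimiter:
    """Reject requests from IPs exceeding max_hits per window_seconds.

    A minimal in-memory sketch: each IP keeps a deque of recent request
    timestamps, and a request is rejected when the window is full.
    """

    def __init__(self, max_hits=10, window_seconds=1.0):
        self.max_hits = max_hits
        self.window = window_seconds
        self.hits = defaultdict(deque)

    def allow(self, ip, now=None):
        """Return True if the request may proceed, False to reject it."""
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have aged out of the sliding window
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_hits:
            return False  # over the limit: reject before doing real work
        q.append(now)
        return True
```

A crawler sending more than `max_hits` requests per window gets its excess requests refused immediately, instead of each one rendering a full .aspx page.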