6

This question is not about protecting against SQL injection attacks. That question has been answered many times on StackOverflow and I have implemented the techniques. This is about stopping the attempts.

Recently my site has been hit with huge numbers of injection attacks. Right now, I trap them and return a static page.

Here's what my URL looks like:

/products/product.php?id=1

This is what an attack looks like:

/products/product.php?id=-3000%27%20IN%20BOOLEAN%20MODE%29%20UNION%20ALL%20SELECT%2035%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C%27qopjq%27%7C%7C%27ijiJvkyBhO%27%7C%7C%27qhwnq%27%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35%2C35--%20

I know for sure that this isn’t just a bad link or fat-fingered typing so I don't want to send them to an overview page. I also don’t want to use any resources on my site delivering static pages.

I’m considering just letting the page die with die(). Is there anything wrong with this approach? Or is there an HTTP status code that I can set with PHP that would be more appropriate?

Edit:

Based on a couple of comments below, I looked up how to return 'page not found'. This Stack Overflow answer by icktoofay suggests sending a 404 header and then calling die(); the bot thinks the page doesn’t exist and might even go away, and no more resources are used to display a page-not-found message.

header("HTTP/1.0 404 Not Found");
die();
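(On PHP 5.4 or later, http_response_code(404); followed by die(); does the same thing, but the header() call above works on any PHP version.)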
JScarry
  • just use prepared statements – bksi Aug 15 '13 at 18:01
  • Returning a 403 error might be somewhat appropriate. – zebediah49 Aug 15 '13 at 18:01
  • Do you have an F5 load balancer before the server? – Colyn1337 Aug 15 '13 at 18:01
  • 403 Forbidden seems apt here. – Asad Saeeduddin Aug 15 '13 at 18:02
  • @bksi If you read the first line, you'd notice that that's not what the question is about – StephenTG Aug 15 '13 at 18:02
  • IMHO, this question is not programming related, it's server admin. – Alnitak Aug 15 '13 at 18:02
  • Alnitak answered before me – bksi Aug 15 '13 at 18:03
  • @Colyn1337 you do know there are other brands of load-balancer, don't you? – Alnitak Aug 15 '13 at 18:04
  • I think a 404 Not Found may be good. If the attack is from a script and it at least checks whether the page exists, the script won't reattempt it – bansi Aug 15 '13 at 18:04
  • @Alnitak yes, but the F5 has iRules – Colyn1337 Aug 15 '13 at 18:04
  • If the OP can afford an F5, he surely can configure it. If he doesn't have an F5 and has problems like this, chances are he can't afford one. – Alnitak Aug 15 '13 at 18:05
  • @Alnitak, you're assuming everyone knows everything about all there is... If that were the case, stackoverflow wouldn't exist. – Colyn1337 Aug 15 '13 at 18:06
  • I tried looking up how to return page not found codes, but I’m not familiar with that part of web development and don’t know how to even form the question. Is that something I can do with PHP? Do you have a link to get me started? – JScarry Aug 15 '13 at 18:08
  • @Colyn1337 I'm not assuming anything. You, on the other hand, appear to be suggesting that an F5 load-balancer is the solution to the OP's problem. That's a very broad (and very expensive!) assumption. – Alnitak Aug 15 '13 at 18:12
  • @Colyn1337 You’re right. I normally get 3,000 hits a month so I don’t have any load to manage. I do use Fail2Ban to manage SSH login attempts but otherwise haven’t had an issue with these kinds of attacks. – JScarry Aug 15 '13 at 18:14
  • @JScarry do you have any hardware that can do packet inspection? PS this would be a great question for http://security.stackexchange.com/ – Colyn1337 Aug 15 '13 at 18:19
  • `fail2ban` can block the traffic after X tries – Daniel W. Aug 18 '14 at 13:00
  • Reopened, if you'd like to move your edit to an answer :) – Taryn East Aug 21 '14 at 01:58
  • Just ban the IP for 1 week. It'll cost you nothing (well, just an entry in the DB), but the size of the IP pool is limited and they will run out of addresses sooner or later. Another option, redirect to: header("Location: http://www.youhavebeenbanned.com/"); – Andrew Aug 21 '14 at 20:05

4 Answers

6

Filtering out likely injection attempts is what mod_security is for.

It can take quite a bit of work to configure it to recognize legitimate requests for your app.

Another common method is to block IP addresses of malicious clients when you detect them.
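For the second approach, here is a minimal PHP sketch of a flat-file blocklist checked on every request. The file name blocked.txt and its one-IP-per-line format are just assumptions for illustration, not anything mod_security requires.

<?php
// Reject requests from IPs we have already flagged.
// blocked.txt is assumed to contain one IP address per line.
$blocked = file_exists('blocked.txt')
    ? file('blocked.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
    : array();
if (in_array($_SERVER['REMOTE_ADDR'], $blocked)) {
    header("HTTP/1.0 403 Forbidden");
    die();
}

// Call this from your injection trap to ban the offender on future requests.
function block_ip($ip) {
    file_put_contents('blocked.txt', $ip . "\n", FILE_APPEND | LOCK_EX);
}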

Bill Karwin
  • Thanks. I'll look into it. The IP addresses change frequently, so that isn’t feasible. – JScarry Aug 15 '13 at 18:05
  • It's feasible. Identify a characteristic of these requests that's extremely unlikely or impossible for regular users to trigger, then ban the IPs immediately when they trip it. – tadman Aug 15 '13 at 18:14
  • I got an anonymous downvote. Downvoter, can you please leave a comment to tell me why you think this answer is not good? Perhaps I can improve it. – Bill Karwin Aug 16 '13 at 15:51
1

You can attempt to stop this traffic from reaching your server with hardware. Most devices that do packet inspection can be of use. I use an F5 for this purpose (among others). The F5 has a scripting language of its own called iRules which affords great control and customization.

Colyn1337
  • I’m using a virtual private server so I don’t think that’s an option, but I’ll look into what they offer. – JScarry Aug 15 '13 at 18:25
0

The question has been reopened, so I thought I’d share what I’ve been doing to reduce attacks from the same IP address. I still get half a dozen a day, but they usually only try once or twice from each IP address.

Note: In order to return the 404 error message, all of this must come before any HTML is sent. I’m using PHP and redirect all errors to an error file.

<?php
require_once('mysql_database.inc');

// I’m using a database, so mysql_real_escape_string works.
// I don’t use any special characters in my productID, but injection attacks do. This helps trap them.
$productID = htmlspecialchars( (isset($_GET['id']) ? mysql_real_escape_string($_GET['id']) : '55') );

// Product IDs are all numeric, so it’s an invalid request if it isn’t a number.
if ( !is_numeric($productID) ) {
    $url = $_SERVER['REQUEST_URI'];  // Track which page is under attack.
    $ref = $_SERVER['HTTP_REFERER']; // I display the referrer just in case I have a bad link on one of my pages
    $ip  = $_SERVER['REMOTE_ADDR'];  // See if they are coming from the same place each time

    // Strip spaces just in case they typed the URL and have an extra space in it
    $productID = preg_replace('/\s+/', '', $productID);
    if ( !is_numeric($productID) ) {
        error_log("Still a long string in products.php after replacement: URL is $url and IP is $ip & ref is $ref");
        header("HTTP/1.0 404 Not Found");
        die();
    }
}

I also have lots of pages where I display different content depending on the category that is picked. In those cases I have a series of if statements, like this: if ($cat == 'Speech') { } There is no database lookup, so no chance of SQL injection, but I still want to stop the attacks and not waste bandwidth displaying a default page to a bot. Usually the category is a short word, so I modify the is_numeric conditional above to check string length instead, e.g. if ( strlen($cat) > 10 ). Since most of the attempts have more than 10 characters in them, it works quite well.
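A minimal sketch of that length check, assuming the category arrives as $_GET['cat'] (the parameter name and the 10-character cutoff are just the example values from above):

$cat = isset($_GET['cat']) ? $_GET['cat'] : '';
// Legitimate categories are short words; injection payloads are long strings.
if ( strlen($cat) > 10 ) {
    $url = $_SERVER['REQUEST_URI'];
    $ip  = $_SERVER['REMOTE_ADDR'];
    error_log("Overlong category parameter: URL is $url and IP is $ip");
    header("HTTP/1.0 404 Not Found");
    die();
}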

JScarry
-1

A very good question (+1 from me), and the answer is not simple.

PHP does not provide a way to maintain data across different pages and different sessions, so you can't limit access by IP address unless you store access details somewhere.

If you don't want to use a database connection for this, you can of course use the filesystem. I'm sure you already know how to do this, but you can see an example here:

DL's Script Archives
http://www.digi-dl.com/
(click on "HomeGrown PHP Scripts", then on "IP/networking", then
on "View Source" for the "IP Blocker with Time Limit" section)

The best option used to be "mod_throttle". Using that, you could restrict each IP address to one access per five seconds by adding this directive to your Apache config file:

<IfModule mod_throttle.c>
    ThrottlePolicy Request 1 5
</IfModule>

But there's some bad news. The author of mod_throttle has abandoned the product:

"Snert's Apache modules currently CLOSED to the public 
  until further notice. Questions as to why or requests
  for archives are ignored."

Another Apache module, mod_limitipconn, is used more often nowadays. It doesn't let you make arbitrary restrictions (such as "no more than ten requests in each fifteen seconds"); all you can do is limit each IP address to a certain number of concurrent connections. Many webmasters seem to advocate it as a good way to fight bot spam, but it does seem less flexible than mod_throttle.

You need different versions of mod_limitipconn depending on which version of Apache you're running:

mod_limitipconn.c - for Apache 1.3
http://dominia.org/djao/limitipconn.html

mod_limitipconn.c - Apache 2.0 port
http://dominia.org/djao/limitipconn2.html

Finally, if your Apache server is hosted on a Linux machine, there's a solution you can use which doesn't involve recompiling the kernel. Instead, it uses the "iptables" firewall rules. This method is rather elegant, and is flexible enough to impose constraints such as "no more than three connections from this IP in one minute". Here's how it's done:

Linux Noob forums - SSH Rate Limit per IP
http://www.linux-noob.com/forums/index.php?showtopic=1829

I realize that none of these options will be ideal, but they illustrate what is possible. Perhaps using a local database will end up being best after all? In any case, bear in mind that simply limiting the rate of requests, or limiting the bandwidth, doesn't solve the problem of bots. They may take longer, but they'll eventually drain just as many resources as they would if they were not slowed down. It's necessary to actually reject their HTTP requests, not simply delay them or spread them out.

Good luck in the escalating battle between content and spam!

Vineet1982