Avoiding DDoS attacks caused by large HTTP request bodies by enforcing a hard limit in your web server

The Django team issued a new security release today. It addresses a DDoS attack against the Django authentication framework which could be triggered by posting a very large password.

In this post I will provide some details on how you can mitigate issues like that by putting a hard limit on the request body size in your web server.

Limiting request body size in your web server

Issues like this can be mitigated pretty easily by limiting the maximum request body size in your web server config. In fact, doing something like this is pretty much always a good idea, because:

  • Your typical Django / Ruby / PHP framework is usually not optimized for handling large amounts of data. Allowing users to pass large amounts of data to it by default is almost always a bad idea.
  • If you allow users to upload files such as images or avatars to your server, this also helps you prevent large file upload DDoS attacks.
  • And there are many other application-level DDoS attacks which can be mitigated or prevented by limiting the input size. And no, this is not an excuse to write sloppy code which doesn’t do any checking (see the sketch below). This is just an additional safeguard and security layer. Remember, security is all about layers.
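
For example, a minimal sketch of such an application-level check in a Django form could look like the snippet below. The MAX_PASSWORD_LENGTH value and the form itself are purely illustrative assumptions, not the actual fix shipped by the Django team:

# Hypothetical application-level safeguard: reject absurdly long passwords
# before they ever reach the (expensive) password hasher.
from django import forms

MAX_PASSWORD_LENGTH = 4096  # illustrative value, tune it for your application

class LoginForm(forms.Form):
    username = forms.CharField(max_length=254)
    password = forms.CharField(widget=forms.PasswordInput,
                               max_length=MAX_PASSWORD_LENGTH)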

I have been doing that for many years now and below you can find short configuration snippets which show how you can do that in Apache and nginx.

Limiting max request body size in Apache

In Apache you can do that by using the LimitRequestBody directive, which is set to zero (unlimited) by default.

I prefer to put this directive into my VirtualHost section. If that is not possible for you because you don’t have access to this section, but AllowOverride is enabled, you can achieve the same thing by putting it in a .htaccess file (an example follows the VirtualHost snippet below).

<VirtualHost *:8000>
    # other stuff

    # Small, safe default (1 MB)
    <Location />
        LimitRequestBody 1048576
    </Location>

    # 2 MB
    <Location /users/profile/edit/avatar>
        LimitRequestBody 2097152
    </Location>

    # 5 MB
    <Location /users/profile/edit/images>
        LimitRequestBody 5242880
    </Location>
</VirtualHost>
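
If you only have access to .htaccess (and AllowOverride permits the directive there), a rough equivalent of the 1 MB default above would be:

# .htaccess in the directory you want to protect
# Limit request bodies to 1 MB
LimitRequestBody 1048576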

Limiting max request body size in Nginx

In nginx, the same can be achieved by using the client_max_body_size directive. Nginx is less problematic in this regard because it’s already set to 1 MB by default, but a lot of people allow users to upload large images or use tools like phpMyAdmin and bump the global default to some large value.

server {
    # other stuff

    client_max_body_size 1m;

    location /users/profile/edit/avatar {
        client_max_body_size 2m;

        # other stuff such as proxy_pass, etc.
    }

    location /users/profile/edit/images {
        client_max_body_size 5m;

        # other stuff such as proxy_pass, etc.
    }
}

Conclusion

As noted above, I believe limiting the maximum request body size is always a good idea.

Besides that, no matter what kind of software you use or work with, it’s always a good idea to set the (global) default to some small, safe and reasonable value rather than the other way around, which is what a lot of people like to do.

If possible, it’s also a good idea to do that not just in your application, but also at a “lower level”. In this case that is the web server, but it could also be a load balancer, your operating system, the kernel or similar (limits.conf is your friend!).

Some people might argue that putting a hard limit in your web server rather than your application can cause a bad user experience, because the web server will return a default error page for the 413 (Request Entity Too Large) status code.

This is a non-issue, because every web server allows you to set a custom error page for each status code, and you should do just that.
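
For example, in nginx you could serve your own page for 413 responses with something along these lines (the paths below are just an illustration):

server {
    # other stuff

    client_max_body_size 1m;

    # serve a custom page instead of the default 413 error
    error_page 413 /errors/413.html;

    location = /errors/413.html {
        # only reachable via error_page, not directly by clients
        internal;
        root /var/www/example.com;
    }
}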