In my Django project, DATA_UPLOAD_MAX_MEMORY_SIZE is set to 2.5 MB, yet I can send a base64 image of more than 10 MB to my server using curl.
How can I limit such requests, and why does that setting not work in my case?


1 Answer


Based on the Django docs for DATA_UPLOAD_MAX_MEMORY_SIZE:

The maximum size in bytes that a request body may be before a SuspiciousOperation (RequestDataTooBig) is raised. The check is done when accessing request.body or request.POST and is calculated against the total request size excluding any file upload data.
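
For reference, the setting itself is just a byte count in settings.py; a minimal sketch (the value shown is Django's default of 2.5 MB):

# settings.py
# Django compares this value against the Content-Length header the first
# time request.body or request.POST is accessed.
DATA_UPLOAD_MAX_MEMORY_SIZE = 2621440  # 2.5 MB (Django's default)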

In other words, the check only runs when Django itself reads request.body or request.POST. If the payload is parsed some other way (for example, Django REST framework's parsers read the request stream directly), the limit may never be triggered, which is likely why it has no effect in your case.

As a workaround, you can add a validator to your Django REST framework serializer and reject oversized files yourself, like the code below:

from django.utils.translation import gettext_lazy as _
from rest_framework import serializers


def file_validator(file):
    """
    Reject uploaded files that exceed the maximum allowed size.
    :param file: the uploaded file object
    :raises serializers.ValidationError: if the file is too large
    """
    max_file_size = 1024 * 1024 * 2  # 2 MB
    if file.size > max_file_size:
        raise serializers.ValidationError(
            _('Max file size is {} and your file size is {}'.format(max_file_size, file.size))
        )


class FileSerializer(serializers.Serializer):
    file = serializers.FileField(validators=[file_validator])
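
A minimal sketch of how the serializer might be wired into a view (FileUploadView is a hypothetical name, not part of the original answer):

from rest_framework.response import Response
from rest_framework.views import APIView


class FileUploadView(APIView):
    def post(self, request):
        serializer = FileSerializer(data=request.data)
        # raise_exception=True turns the validator's error into an HTTP 400
        serializer.is_valid(raise_exception=True)
        uploaded = serializer.validated_data['file']
        # ... store or process `uploaded` here ...
        return Response({'size': uploaded.size})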
  • Is there any way to drop the request at the gunicorn level? I used this setting to prevent gunicorn timeouts, but it does not help and I still get the timeout when the request payload is too large. – Mairon Oct 13 '18 at 12:28
  • I think the timeout is on the nginx side, and you can change the max file size in nginx: https://stackoverflow.com/a/26717238/8258902 – mastisa Oct 13 '18 at 12:37
  • Yeah, but I was interested in handling it after nginx. BTW, I wrote a pre_request hook in gunicorn to check for it (a rough sketch of such a hook is below). – Mairon Oct 13 '18 at 12:39
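
For completeness, a rough sketch of the kind of gunicorn pre_request hook the last comment describes (the 2 MB limit and the hook body are assumptions, not the commenter's actual code; gunicorn offers no clean way to return a 413 from this hook, so raising simply aborts the request with a generic error):

# gunicorn.conf.py
MAX_BODY = 2 * 1024 * 1024  # assumed 2 MB limit


def pre_request(worker, req):
    # req.headers is a list of (name, value) tuples from gunicorn's parser
    content_length = 0
    for name, value in req.headers:
        if name.lower() == 'content-length':
            try:
                content_length = int(value)
            except ValueError:
                pass
    if content_length > MAX_BODY:
        worker.log.warning("Rejecting %s %s: %d bytes exceeds the limit",
                           req.method, req.path, content_length)
        # No clean 413 here; raising aborts handling and the client
        # receives a generic error response instead.
        raise RuntimeError("request body too large")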