
I have to deploy a Python project to an AWS Lambda function. When I create its zip package it is around 80 MB (Lambda allows up to 50 MB). I also cannot upload it via S3, because the size of the uncompressed package is around 284 MB (Lambda allows up to 250 MB unpacked). Any idea how to tackle this problem, or is there an alternative?

2 Answers


The ZIP file itself is not really the issue, as you can split your dependencies into Lambda layers. Each layer's zip can be up to 50 MB, so you could use two layers.
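
For illustration, a minimal packaging sketch, assuming the dependencies for each layer have already been installed into local directories (e.g. with pip install -t deps_layer1 ...); the directory names here are placeholders. It zips each directory under the python/ prefix that Lambda expects for Python layers:

import os
import zipfile

def build_layer_zip(deps_dir, zip_path):
    # Zip a local dependency directory under the 'python/' prefix required
    # for Python Lambda layers (layers are unpacked into /opt at runtime).
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for root, _, files in os.walk(deps_dir):
            for name in files:
                full = os.path.join(root, name)
                arcname = os.path.join('python', os.path.relpath(full, deps_dir))
                zf.write(full, arcname)

# Hypothetical usage: dependencies split into two directories, one zip per layer
build_layer_zip('deps_layer1', 'layer1.zip')
build_layer_zip('deps_layer2', 'layer2.zip')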

The issue is that the total unpacked size of your function plus its layers must still be less than 250 MB. There is no golden solution for this. You have to lose 34 MB from your 284 MB. You can go through the contents of your dependencies and manually remove documentation files, text files, or anything else that is non-essential. If the dependencies include compiled shared objects, you can use tools such as strip to reduce their size.
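
A rough pruning sketch (the directory and extension lists are assumptions and should be adjusted per project; strip must be available on the build machine, and the trimmed package should be re-tested, since some libraries need their metadata or data files at runtime):

import os
import shutil
import subprocess

PRUNE_DIRS = {'__pycache__', 'tests', 'test', 'docs', 'examples'}
PRUNE_SUFFIXES = ('.pyc', '.pyo', '.md', '.rst', '.txt')

def slim_dependencies(pkg_dir):
    for root, dirs, files in os.walk(pkg_dir, topdown=True):
        # Remove non-essential directories in place so os.walk skips them
        for d in list(dirs):
            if d in PRUNE_DIRS:
                shutil.rmtree(os.path.join(root, d))
                dirs.remove(d)
        for name in files:
            path = os.path.join(root, name)
            if name.endswith(PRUNE_SUFFIXES):
                os.remove(path)
            elif name.endswith('.so'):
                # Strip debug symbols from compiled shared objects; ignore failures
                subprocess.run(['strip', path], check=False)

slim_dependencies('deps_layer1')  # hypothetical directory of installed dependencies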

Basically, it's a try-and-see approach to find out what can be removed from your dependencies and what cannot. This can be a time-consuming and troublesome process. Alternatively, you may refactor your application to get rid of some of the dependencies.

Is there any alternative for it?

The alternative is not to use a Lambda function. The closest alternative is probably an ECS Fargate service.

Marcin
  • Thanks @Marcin for your help. The initial size of the project was around 310 MB, which I reduced to 280 MB. After some more work I have now reduced it to 238 MB. Can you please describe in a few steps how I can deploy this now using layers or S3? I haven't done this before, so it's a bit tricky for me. – Muhammad Arsalan Hassan Oct 09 '20 at 06:49
  • @MuhammadArsalanHassan Glad to hear. But it's case-specific; there is no golden recipe for dividing projects into layers, as it depends on how your project is organized. You have to go through the AWS docs for [layers](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html), understand how they work, and take it from there. – Marcin Oct 09 '20 at 06:57
  • I am using AWS Lambda layers for the libraries and it's working fine, but there is a problem. I am using the matplotlib library, which in turn uses numpy. When I test my Lambda function it gives the following error: No module named 'numpy.core._multiarray_umath'. Any idea what the issue could be and how to resolve it? – Muhammad Arsalan Hassan Oct 12 '20 at 10:16
  • @MuhammadArsalanHassan AWS provides a layer with SciPy and NumPy. You can try using it. – Marcin Oct 12 '20 at 10:34
  • I tried. They don't! I have to use it via layers. Any idea what could be the reason for the error? – Muhammad Arsalan Hassan Oct 12 '20 at 10:40
  • @MuhammadArsalanHassan Sadly not. You can post a new question about this problem with the relevant details. – Marcin Oct 12 '20 at 10:46

Unfortunately, you can't upload a deployment to Lambda whose uncompressed size exceeds 250 MB, even when you use layers (see the note in the AWS docs). There is a workaround for this, but it will hurt your Lambda's performance considerably.

  • Write a driver function in Lambda for your code.
  • This driver function will download your code from S3 into Lambda's ephemeral storage (/tmp/) and then run it.

The eventual solution is to use ECS Fargate (serverless) or AWS Batch.

Adding a code sample for the Lambda-based solution:

import os
import sys
import zipfile

import boto3

# Make the unpacked project importable from Lambda's ephemeral storage
sys.path.insert(1, '/tmp')

def download_code_from_s3(bucket_name, file_path):
    # Download the packaged project from S3 into /tmp/
    s3_resource = boto3.resource('s3')
    bucket = s3_resource.Bucket(bucket_name)
    bucket.download_file(file_path, os.path.join('/tmp/', file_path))

def lambda_handler(event, context):
    # Download the zip file from S3 to /tmp/
    download_code_from_s3('bucket_name', 'code.zip')

    # Unpack with the standard-library zipfile module; the unzip binary is not
    # guaranteed to be present in the Lambda runtime
    with zipfile.ZipFile('/tmp/code.zip') as archive:
        archive.extractall('/tmp/')

    # Import and run the actual handler, where code.py is the entry point of the project
    from code import my_handler
    return my_handler(event, context)

The above code is deployed as the Lambda function. Its responsibility is to download the zip file, unzip it, import the entry point, and run it. The unzipped contents must not exceed the 512 MB /tmp storage limit.
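
For completeness, a minimal sketch of the packaging step this approach assumes (my_project/, code.zip, and bucket_name are placeholders, not from the original answer):

import shutil

import boto3

# Zip the project directory into code.zip and upload it to the bucket that the
# driver function above reads from
shutil.make_archive('code', 'zip', 'my_project/')
boto3.client('s3').upload_file('code.zip', 'bucket_name', 'code.zip')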

amsh