
When I create an AWS Lambda Layer, all the contents / modules of my zip file go to /opt/ when the AWS Lambda executes. This quickly becomes cumbersome and frustrating because I have to load every module from its absolute /opt path in all my Lambdas. Example:

import json
import os
import importlib.util
spec = importlib.util.spec_from_file_location("dynamodb_layer.customer", "/opt/dynamodb_layer/customer.py")
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)

def fetch(event, context):

    CustomerManager = module.CustomerManager
    customer_manager = CustomerManager()

    body = customer_manager.list_customers(event["queryStringParameters"]["acquirer"])

    response = {
        "statusCode": 200,
        "headers": {
            "Access-Control-Allow-Origin": "*"
        },
        "body": json.dumps(body)
    }

    return response

So I was wondering, is it possible to add these /opt paths to the PYTHONPATH environment variable beforehand through serverless.yml? That way, I could just from dynamodb_layer.customer import CustomerManager, instead of that freakish ugliness.

MikeW
Ericson Willians

4 Answers


I have a Lambda layer for the Python 3.6 runtime. My my_package.zip structure is:

my_package.zip
 - python
   - lib
     - python3.6
       - site-packages
         - customer

All dependencies are in the build folder in the project root, e.g. build/python/lib/python3.6/site-packages/customer

Relevant section of my serverless.yml

layers:
  my_package:
    path: build             
    compatibleRuntimes:     
      - python3.6

In my Lambda I import my package as I would any other package: import customer

A.Khan
  • Do you know if there is a way to set the path in serverless? I have a bunch of files that I don't want to have to zip up in a specific directory structure, but I'd like them to end up in `/opt/python/lib/python3.6/site-packages/shared`. Is there a way to do this without organizing them in that same directory structure in git? – Craig Feb 28 '19 at 14:40
  • @Craig Your build script can manipulate the directories in any way it sees fit, before packaging them into a zip file. This is typically called 'staging'. In this case, your build would create a directory called 'python' then copy required files and package-dies into it, before creating the zip and uploading it as a layer. See my answer for example ! – MikeW Dec 03 '20 at 17:06

Have you tried setting your PYTHONPATH env var? https://stackoverflow.com/a/5944201/6529424

Have you tried adding to sys.path? https://stackoverflow.com/a/12257807/6529424

eagle33322
  • These are not the idiomatic ways to add import-able modules and packages, using Lambda Layers. See below. – MikeW Dec 03 '20 at 17:05

In the zip archive, the module needs to be placed in a python subdirectory, so that when it is extracted as a layer in Lambda, it is located in /opt/python. That way you'll be able to import your module directly, without the need for importlib.

It's documented here, or see this detailed blog post from an AWS developer evangelist for more.

Milan Cermak
  • The links you provide did not seem explicit enough for my need to use my own package(s) - hence I have written things up a little more directly in my own answer below, – MikeW Dec 03 '20 at 14:51
  • Good for you mate. You don't have to run around spamming everyone's answers though... – Milan Cermak Dec 04 '20 at 15:16
  • Sorry you feel annoyed, @milan - I was in the middle of the issue myself, and thought I would add some extra info for the use of others. It was by no means "spam" or self-promotion. And I try to give credit where it's due. If you had new detail for a SO problem, I trust you would do the same. – MikeW Dec 09 '20 at 09:19

Setting the PYTHONPATH variable is not required, as long as you place items correctly inside the zip file.

Simple modules and package directories should be placed inside a directory named "python", and then the whole python/ directory placed into the zip file for uploading to AWS as a layer. Don't forget to set the "compatible runtimes" (e.g. Python 3.6, 3.7, 3.8 ...) on the layer.

So as an example:

python/
- my_module.py
- my_package_dir
-- __init__.py
-- package_mod_1.py
-- package_mod_2.py

which then get included in the zip file.

zip -r my_layer_zip.zip python/

The modules can then be imported without any further ado, once the zip is attached as a layer:

....
import my_module
from my_package.package_mod_2 import mod_2_function
....

You can see the package structure from within the Lambda by looking at '/opt/python/', which will show my_module.py, my_package/ etc. This is easily verified with the AWS Lambda test console, assuming the layer is attached to the function (otherwise the code will error):

import json
import os

def lambda_handler(event, context):
    # TODO implement

    dir_list = os.listdir('/opt/python/')

    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!'),
        'event': json.dumps(event),
        '/opt/python/': dir_list
    }
MikeW