
Like many others before me, I'm trying to run an AWS Lambda function and when I try to test it, I get

"errorMessage": "Unable to import module 'lambda_function'"

My Handler is set to lambda_function.lambda_handler, and I indeed have a file named lambda_function.py which contains a function called lambda_handler. Here's a screenshot as proof: [screenshot of the Handler setting]

Everything was working fine when I was writing snippets of code inline in the included IDE, but when I zipped my full program with all of its dependencies and uploaded it, I got the above error.

I'm using the NumPy and SciPy packages, which are quite large. My zipped directory is 34 MB, and my unzipped directory is 122 MB. I think this should be fine since the limit is 50 MB for the zipped deployment package. It appears to be uploading fine, since I see the message:

The deployment package of your Lambda function "one-shot-image-classification" is too large to enable inline code editing. However, you can still invoke your function right now.

I've seen that some posts solve this by using virtualenv, but I'm not familiar with that technology and I'm not sure how to use it properly.

I've also seen some posts saying that sometimes dependencies have dependencies and I may need to include those, but I'm not sure how to find this out.
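
For reference, the approach those posts describe seems to boil down to installing the dependencies into the project folder with pip (which pulls in their own dependencies automatically) and then zipping the folder's contents, roughly like this (the exact commands are my sketch, not something those posts prescribe):

# run from inside the project folder that contains lambda_function.py
pip install numpy scipy -t .
# pip also installs any transitive dependencies into the current directory
zip -r ../deployment.zip .
# deployment.zip should now contain lambda_function.py and the packages at its root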

Here's the top portion of lambda_function.py, which should be enough to see the libraries I'm using and that I do indeed have a lambda_handler function:

import os
import boto3
import numpy as np
from scipy.ndimage import imread
from scipy.spatial.distance import cdist

def lambda_handler(event, context):

    s3 = boto3.resource('s3')

Here's a screenshot of the unzipped version of the directory I'm uploading: [screenshot of the directory contents]

I can also post the policy role that my Lambda is using if that could be an issue.

Any insight is much appreciated!

UPDATE:

Here's one solution I tried:

1. git clone https://github.com/Miserlou/lambda-packages
2. Create a folder in Documents called new_lambda.
3. Copy my lambda_function.py and the numpy folder from lambda-packages into new_lambda, along with the scipy library that I compiled using Docker for AWS, as per this article: https://serverlesscode.com/post/scikitlearn-with-amazon-linux-container/
4. Zip the new_lambda folder by right-clicking it and selecting 'compress'.

My results:

Unable to import module 'lambda_function': No module named 'lambda_function'

To reiterate, my file is named lambda_function.py and contains a function called lambda_handler, which accepts two arguments (as seen above). This matches the Handler setting shown above.

I am using a Mac computer, if that matters.

UPDATE 2

If I follow the above steps but instead zip by directly selecting the files I want to compress, then right-clicking and selecting 'compress', I instead get the error

Unable to import module 'lambda_function': cannot import name 'show_config'

Also, the precompiled lambda-packages repo says its packages are compiled for "at least Python 2.7", but my Lambda runtime is Python 3.6. Could this be an issue?
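
One way to sanity-check the archive before uploading is to list its contents, e.g. for a zip named Archive.zip:

unzip -l Archive.zip
# lambda_function.py and the package folders should appear at the top level of the listing,
# with no extra leading directory in front of them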

Spencer Goff
  • In my experience this is a naming issue with the handler function, but your naming seems fine. Double check after zipping that if your handler is foo.bar, then your lambda function file is foo.py with a function bar() inside? – Andrew Zick Apr 09 '18 at 14:14
  • As seen in my question, I have a file named lambda_function.py which has a function (shown above) named lambda_handler. I'll update the question with a screenshot of my Handler setting. Thanks for looking! – Spencer Goff Apr 09 '18 at 14:17
  • @SpencerGoff also check if the `lambda_function.py` is in the root of the deployed ZIP and not inside the `One-Shot-Learning-Lambda` folder – laika Apr 09 '18 at 14:31
  • Good idea, I'm trying that now. I just realized I've been misunderstanding the use of the term "root", so that's probably my issue. Let's see. – Spencer Goff Apr 09 '18 at 14:34
  • Okay, now I'm getting the same error plus "no module named numpy". Probably because that's also within the one-shot-learning folder. I'll try to fix that now... – Spencer Goff Apr 09 '18 at 14:38
  • I flattened the directory structure (see updated question) but I'm still getting "Unable to import module 'lambda_function'". The name of the file I'm uploading is Archive.zip, if that matters? – Spencer Goff Apr 09 '18 at 14:47
  • I just re-zipped everything and uploaded, now getting: Unable to import module 'lambda_function': Importing the multiarray numpy extension module failed. Most likely you are trying to import a failed build of numpy. If you're working with a numpy git repo, try `git clean -xdf` (removes all files not under version control). Otherwise reinstall numpy. Original error was: cannot import name 'multiarray' – Spencer Goff Apr 09 '18 at 15:27

6 Answers


The problem is that your local numpy and scipy are compiled for your local machine's architecture. Since AWS Lambda runs on Amazon Linux, they are probably not compatible.

So if you want to use them, you have two choices:

  • Compile the dependencies on an EC2 instance running the same Amazon Linux version as AWS Lambda and create the deployment package there.

  • Use one of the precompiled packages from here

P.S. I've read the comments on the post, so I can see that the file and function names are fine and it's numpy that's giving you trouble.
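
For the first option, the build would look roughly like this (package names and paths are illustrative, not prescribed by AWS):

# on an EC2 instance (or Docker container) running the same Amazon Linux as Lambda
mkdir package && cd package
pip install numpy scipy -t .
cp /path/to/lambda_function.py .   # path is illustrative
zip -r ../deployment.zip .
# download deployment.zip and upload it to your Lambda function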

ljmocic
  • Thanks for the response, I've updated the question with trying your second option. I'll try your first option if I can't get this to work. – Spencer Goff Apr 15 '18 at 18:35

The solution was to zip the precompiled numpy and scipy packages from this source.

Spencer Goff

I had a similar issue:

Unable to import module 'lib/lambda_function': No module named 'lib/lambda_function'

The fix for me, and possibly for you, was to include a blank __init__.py in the same directory as lambda_function.py.

Why does __init__.py fix the issue?

I understand that it is needed for the directory (lib in my case, . in yours) to be considered a valid Python package.

Here is the reference doc I based that hypothesis on: 5.2.1. Regular packages - Python 3.7.3 documentation
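
A minimal sketch of that fix, assuming the same lib/ layout (the handler value would then be lib/lambda_function.lambda_handler):

# create an empty __init__.py next to the handler so 'lib' is treated as a regular package
touch lib/__init__.py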

Alain O'Dea

I ran into this issue on macOS as well. I see you mention that the way you select your files affected whether it worked properly. It turns out this is true!

On a Mac, if the hidden .DS_Store file (or the __MACOSX folder) sneaks into the archive, it seems to break Lambda!

The solution is to

rm .DS_Store

in the folder you're zipping up for deployment.
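
A couple of ways to clear that out, before or after zipping (the archive name here is illustrative):

# remove Finder metadata from the source tree before zipping
find . -name .DS_Store -delete
# or strip the stray entries out of an archive that has already been built
zip -d deployment.zip ".DS_Store" "__MACOSX/*"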

dnola

I was receiving this same error, but the cause of the error was different. Adding an answer here in case some other weary StackOverflow wanderer finds it helpful.

In my case, I was attempting to upload the directory below, where package is a sample Python package dependency and my function code is in lambda_function.py: [screenshot of the local function directory]

I was zipping the entire function directory, which resulted in the following file structure when deployed to Lambda: [screenshot of the deployed file structure, with the function directory as an extra level]

In order to run properly, both lambda_function.py and the package directory should be in the top-level Lambda directory (ConfigureAppFlow in my case). The function directory is an extra layer, and that is what causes the error.

To fix this, instead of compressing my function directory, I directly compressed the two items inside it: [screenshot of the two items selected for compression]

This resulted in the following file structure when deployed to Lambda (ignore the __MACOSX folder): [screenshot of the corrected deployed file structure]

In summary, I'm sure there are a ton of different causes of this issue, but the first thing to check is that the zip file you upload to Lambda results in a file structure that places lambda_function.py and any dependent packages in the top-level Lambda directory.
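
In command-line terms the difference looks roughly like this (the local directory name is illustrative):

# wrong: zipping the enclosing folder adds an extra directory level at the archive root
zip -r deployment.zip my_function_dir
# right: zip the contents from inside the folder so lambda_function.py and package/ land at the root
cd my_function_dir
zip -r ../deployment.zip lambda_function.py package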

Daniel Long

The way the files are compressed seems to be the issue here; I just solved the same problem with my code files. Simply highlight the files/folders you want to compress, then right-click and compress those directly.

For whatever reason, I keep getting an additional folder inside my initial project folder, which throws off AWS: it doesn't know to go two directories deep and only looks at the root directory of the archive for the files.