I am creating a Lambda function to analyse a stock price and upload the findings as a .csv file to my S3 bucket. To do that I'm using s3fs, but I'm unable to import s3fs in my Lambda function. The error message is:
```json
{
  "errorMessage": "cannot import name 'resolve_checksum_context' from 'botocore.client' (/var/runtime/botocore/client.py)",
  "errorType": "ImportError",
  "requestId": "314d356c-c114-4c97-84a2-deb388f6cf8a",
  "stackTrace": [
    " File \"/var/task/lambda_function.py\", line 11, in lambda_handler\n import s3fs\n",
    " File \"/mnt/packages/s3fs/__init__.py\", line 1, in <module>\n from .core import S3FileSystem, S3File\n",
    " File \"/mnt/packages/s3fs/core.py\", line 20, in <module>\n import aiobotocore.session\n",
    " File \"/mnt/packages/aiobotocore/session.py\", line 7, in <module>\n from .client import AioClientCreator, AioBaseClient\n",
    " File \"/mnt/packages/aiobotocore/client.py\", line 2, in <module>\n from botocore.client import logger, PaginatorDocstring, ClientCreator, \\\n"
  ]
}
```
What have I done so far?
- Used a Lambda layer to load the `s3fs` package: I zipped the required `s3fs` package files and uploaded the archive as a layer (the same package imported without error in the virtual environment on my local machine). The error message was the same as above. The zipped file had the file structure described in the AWS documentation.
I even tried:

```python
sys.path.append('/opt/python/lib/python3.9/site-packages')
import s3fs
```

`/opt` is where files uploaded as layers are present.
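Put together, the failing handler looked roughly like this (a reconstruction from the traceback above; the actual analysis code is omitted):

```python
import sys

# Layers are extracted under /opt, so make the layer's site-packages importable.
sys.path.append('/opt/python/lib/python3.9/site-packages')

def lambda_handler(event, context):
    import s3fs  # raises the ImportError shown above
    return {"statusCode": 200}
```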
- Mounted an EFS file system with the required files to my Lambda function: I used an EC2 instance to download all the packages into the EFS file system (this included all the other pip packages, such as boto3, NumPy, pandas, etc.). Those other packages loaded into my Lambda function without a hiccup.

```python
sys.path.append("/mnt/packages")
import s3fs
```

`/mnt/packages` is the local mount path of the EFS file system in my Lambda function.
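The traceback above hints at what is going on: `botocore` resolves from `/var/runtime` (the runtime's bundled copy) rather than from the appended path, since appended entries come after `/var/runtime` on `sys.path`. A quick diagnostic sketch to confirm which copy is actually imported:

```python
import sys
sys.path.append("/mnt/packages")

import botocore

def lambda_handler(event, context):
    # If this reports a path under /var/runtime, the runtime's bundled
    # (older) botocore is shadowing the copy on the EFS mount.
    return {
        "botocore_version": botocore.__version__,
        "botocore_path": botocore.__file__,
    }
```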
```
Name: s3fs
Version: 0.4.2
```
Both of the above methods produced the same error. The same version of the package had no issues on my local machine, in a virtual environment, or in a Docker build; the error appears only in my AWS Lambda function.
Since the ultimate goal is to read the .csv file from S3, update the findings, and re-upload the same file, I would also be fine with any other method that circumvents the use of s3fs.
```json
{
  "errorMessage": "Unable to import module 'lambda_function': Install s3fs to access S3",
  "errorType": "Runtime.ImportModuleError",
  "requestId": "aee2b9a8-699e-43eb-949a-2c0e427696aa",
  "stackTrace": []
}
```
EDIT: I was able to upload using boto3. The solution can be found here. But when using s3fs, it still did not work.
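For completeness, a minimal sketch of the boto3-based approach (the bucket and key names below are placeholders): download the .csv into memory, update it with pandas, and put the same object back.

```python
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
BUCKET = "my-bucket"   # placeholder
KEY = "findings.csv"   # placeholder

def lambda_handler(event, context):
    # Read the existing CSV from S3 into a DataFrame.
    obj = s3.get_object(Bucket=BUCKET, Key=KEY)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # ... update the findings here ...

    # Serialize back to CSV in memory and overwrite the same object.
    buf = io.StringIO()
    df.to_csv(buf, index=False)
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=buf.getvalue())
    return {"statusCode": 200}
```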