
WHAT I WANT TO DO

  • I have 3 buckets. I want to trigger a Lambda function when an image is uploaded to bucket1 (the shape of the event the trigger delivers is sketched after this list).
  • The Lambda function will resize that image to 500x500 and save the resized image in bucket2.
  • The original image in bucket1 will be backed up to bucket3.
  • The original image in bucket1 will then be deleted.

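For reference, the S3 trigger delivers a notification event of roughly the shape sketched below. It is trimmed to the fields used later (the bucket name and the object key), and the values are just the examples from this question; passing such a dict to the handler is also a quick way to test it locally.

sample_event = {
    "Records": [
        {
            "eventSource": "aws:s3",
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "bucket1"},
                "object": {"key": "image/myimage.jpg"},
            },
        }
    ]
}
# lambda_handler(sample_event, None)  # hypothetical local test invocation
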
WHAT I HAVE DONE SO FAR

  • Wrote a Lambda function that will move images across the buckets.

  • Made an S3 trigger for the Lambda function (a sketch of how such a trigger could be wired up with boto3 follows this list).

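For completeness, below is a minimal sketch of what wiring up that trigger with boto3 could look like; the function ARN, statement ID, and bucket name are hypothetical placeholders, and the same setup can of course be done from the console.

import boto3

lambda_client = boto3.client('lambda')
s3_client = boto3.client('s3')

function_arn = 'arn:aws:lambda:us-east-1:123456789012:function:resize-image'  # hypothetical
source_bucket = 'bucket1'  # hypothetical

# Let S3 invoke the function
lambda_client.add_permission(
    FunctionName=function_arn,
    StatementId='s3-invoke-resize',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::{}'.format(source_bucket),
)

# Fire the function whenever an object is created in the source bucket
s3_client.put_bucket_notification_configuration(
    Bucket=source_bucket,
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': function_arn,
            'Events': ['s3:ObjectCreated:*'],
        }]
    },
)
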
WHERE IS MY PROBLEM

  • I am using PIL to resize the image, but PIL is not in the Python standard library, so I zipped my code together with its site-packages and was able to run it. However, the function now fails with a "file not found" error. An example of a file key: image/myimage.jpg (a likely cause is sketched after this list).

  • I followed tutorials, but when I try to use Lambda layers instead of zipping the code every time, the PIL module does not seem to be found.

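A likely cause of the "file not found" error is the slash in the key: keys in S3 notification events are URL-encoded, and a key such as image/myimage.jpg makes the code below build a path like /tmp/<uuid>image/myimage.jpg, whose image/ sub-directory does not exist inside the Lambda container. The hypothetical helper below is only a sketch of one way to decode the key and flatten it into a safe /tmp filename. Regarding the layer problem: a Python Lambda layer has to unzip into a python/ directory (for example python/PIL/... or python/lib/python3.8/site-packages/PIL/...), otherwise the runtime cannot import the module.

import os
import uuid
from urllib.parse import unquote_plus

def safe_tmp_path(raw_key, prefix=''):
    # Keys arrive URL-encoded in the event (e.g. spaces become '+')
    key = unquote_plus(raw_key)
    # Replace '/' so 'image/myimage.jpg' does not point at a /tmp
    # sub-directory that was never created
    flat_name = key.replace('/', '_')
    return os.path.join('/tmp', '{}{}-{}'.format(prefix, uuid.uuid4(), flat_name))
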
CODE


import boto3
import os
import sys
import uuid
from PIL import Image
import PIL.Image

s3_client = boto3.client('s3')

def resize_image(image_path, resized_path):
    with Image.open(image_path) as image:
        image.thumbnail((500, 500))
        image.save(resized_path)

def lambda_handler(event, context):
    #
    # getting a KeyError here on event['Records']
    #
    for record in event['Records']: 
        bucket = 'mahabubelahibucket1'
        key = record['s3']['object']['key'] 
        download_path = '/tmp/{}{}'.format(uuid.uuid4(), key)
        upload_path = '/tmp/resized-{}'.format(key)

        s3_client.download_file(bucket, key, download_path)
        resize_image(download_path, upload_path)
        s3_client.upload_file(upload_path, 'mahabubelahibucket2', key)
  • Did you forget to add the created layer in the Layers section of your Lambda? – omuthu Mar 02 '20 at 12:51
  • Hi @omuthu, no I did not. First I created a Lambda layer and then added that layer by ARN in the function. – Mahabub Elahi Shojib Mar 02 '20 at 19:08
  • Sounds a lot like: [Tutorial: Using AWS Lambda with Amazon S3](https://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html). The tutorial includes steps for creating a zip with all required libraries. – John Rotenstein Mar 03 '20 at 05:27

1 Answer


This is my solution to the problem!

import boto3
import os
from PIL import Image
import pathlib
from io import BytesIO

s3 = boto3.resource('s3')

def delete_this_bucket(name):
    # Empty the bucket first, then delete the bucket itself
    bucket = s3.Bucket(name)
    try:
        for key in bucket.objects.all():
            key.delete()
        bucket.delete()
    except Exception as e:
        print(e)

def create_this_bucket(name, location):
    try:
        s3.create_bucket(
            Bucket=name,
            CreateBucketConfiguration={
                'LocationConstraint': location
            }
        )
    except Exception as e:
        print(e)

def upload_test_images(name):
    # Upload every file from the local ./testimage folder into the given bucket
    for each in os.listdir('./testimage'):
        try:
            file = os.path.join(os.path.abspath('./testimage'), each)
            s3.Bucket(name).upload_file(file, each)
        except Exception as e:
            print(e)

def copy_to_other_bucket(src, des, key):
    try:
        copy_source = {
            'Bucket': src,
            'Key': key
        }
        bucket = s3.Bucket(des)
        bucket.copy(copy_source, key)
    except Exception as e:
        print(e)


def resize_image(src_bucket, des_bucket):
    size = 500, 500
    bucket = s3.Bucket(src_bucket)
    client = boto3.client('s3')

    for obj in bucket.objects.all():
        file_byte_string = client.get_object(Bucket=src_bucket, Key=obj.key)['Body'].read()
        im = Image.open(BytesIO(file_byte_string))

        im.thumbnail(size, Image.ANTIALIAS)
        # ISSUE : https://stackoverflow.com/questions/4228530/pil-thumbnail-is-rotating-my-image

        # Use a fresh in-memory buffer per object so leftover bytes from the
        # previous image never end up in the next upload
        in_mem_file = BytesIO()
        im.save(in_mem_file, format=im.format)
        in_mem_file.seek(0)

        response = client.put_object(
            Body=in_mem_file,
            Bucket=des_bucket,
            Key='resized_' + obj.key
        )

def lambda_handler(event, context):
    bucket = s3.Bucket('myimagebucket0099')

    # Back up every original image, then write resized copies to the resized bucket
    for obj in bucket.objects.all():
        copy_to_other_bucket(bucket.name, 'backupimagebucket0099', obj.key)

    resize_image(bucket.name, 'resizedimagebucket0099')


    print(bucket)
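
Note that the handler above copies and resizes but never deletes the originals from the source bucket (step 4 of the question), and delete_this_bucket is defined but never called. A minimal sketch of that last step, reusing the s3 resource from the code above with the same placeholder bucket name, could look like this:

def delete_originals(src_bucket_name):
    # Remove each original object once it has been backed up and resized
    bucket = s3.Bucket(src_bucket_name)
    for obj in bucket.objects.all():
        obj.delete()

# e.g. at the end of lambda_handler:
# delete_originals('myimagebucket0099')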