I have a Python script that takes a video and converts it into a series of small panoramas. There's an S3 bucket where a video file (mp4) will be uploaded, and I need that file to be sent to an EC2 instance whenever it is uploaded. This is the flow:
- Upload the video file to S3.
- This should trigger an EC2 instance to start.
- Once it is running, I want the file to be copied to a particular directory inside the instance.
- After this, I want the script (panorama.py) to run, read the video file from that directory, process it, and generate the output images.
- These output images need to be uploaded to a new bucket, or to the same bucket that was used initially.
- The instance should terminate after this (I've sketched what I imagine the instance-side part of this looks like right after this list).
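To cover the instance-side steps above, I'm imagining a script like the following running on the instance at boot. This is only my rough guess, not something I have working: the queue URL, bucket name, and working directory are placeholders, and I'm assuming panorama.py writes its output images as .jpg files into the working directory.

```python
import json
import pathlib
import subprocess
import boto3

s3 = boto3.client('s3')
sqs = boto3.client('sqs')

# Placeholders -- not my real values
QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/123456789012/video-jobs'
OUTPUT_BUCKET = 'panorama-output-bucket'
WORK_DIR = pathlib.Path('/home/ec2-user/videos')

def main():
    # Long-poll the queue for one job message
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=20
    )
    for msg in resp.get('Messages', []):
        job = json.loads(msg['Body'])

        # Copy the mp4 from the bucket into the working directory
        local_path = WORK_DIR / pathlib.Path(job['key']).name
        s3.download_file(job['bucket'], job['key'], str(local_path))

        # Run the panorama script on the downloaded video
        subprocess.run(['python3', 'panorama.py', str(local_path)], check=True)

        # Upload the generated images (assuming panorama.py writes .jpg files
        # into the working directory), then delete the processed message
        for image in WORK_DIR.glob('*.jpg'):
            s3.upload_file(str(image), OUTPUT_BUCKET, image.name)
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg['ReceiptHandle'])

    # Shut the machine down; whether this stops or terminates the instance
    # depends on the instance's shutdown-behavior setting
    subprocess.run(['sudo', 'shutdown', '-h', 'now'])

if __name__ == '__main__':
    main()
```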
What I have done so far: I created a Lambda function that is triggered whenever an object is added to that bucket, and it stores the name and path of the file. I have read that I now need an SQS queue, pass this name-and-path metadata to the queue, and use SQS to trigger the instance. Then I need to run a script on the instance that pulls the metadata from the SQS queue and uses it to copy the file (mp4) from the bucket to the instance. How do I do this? I am new to AWS, so I do not know much about SQS, how to transfer metadata, how to automatically trigger the instance, and so on.
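For reference, here is roughly what I have for the Lambda, extended with my guess at the SQS send and instance-start calls. The queue URL and instance ID are placeholders, and I'm assuming the instance already exists in a stopped state:

```python
import json
import urllib.parse
import boto3

sqs = boto3.client('sqs')
ec2 = boto3.client('ec2')

# Placeholders -- not my real values
QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/123456789012/video-jobs'
INSTANCE_ID = 'i-0123456789abcdef0'

def lambda_handler(event, context):
    # The S3 event carries the bucket name and object key of the upload
    record = event['Records'][0]
    bucket = record['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(record['s3']['object']['key'])

    # Pass the file's location to the queue as a JSON message
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({'bucket': bucket, 'key': key}),
    )

    # Start the stopped instance so it can pick up the job
    ec2.start_instances(InstanceIds=[INSTANCE_ID])
```

Is this the right overall approach, and how do I wire these pieces together?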