I am looking to run a Scrapy project in a Docker container as part of a larger docker-compose application. My idea is to start from an Ubuntu base image, install all the dependencies, and then get it going. Ideally, I would like the container to run continuously, and when I want to run the Scrapy project I would `docker exec` into it. The long-term goal is to have the Scrapy project run as a scheduled task every day.
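Roughly what I'm picturing for the image (the base tag, paths, and project name below are just placeholders, not a finished setup):

```dockerfile
# Dockerfile (sketch): Ubuntu base image plus the Scrapy dependencies
FROM ubuntu:22.04

# Python and pip from the Ubuntu repos
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Scrapy itself (plus whatever else the project ends up needing)
RUN pip3 install scrapy

# "myproject" is a placeholder for the actual Scrapy project directory
COPY ./myproject /app/myproject
WORKDIR /app/myproject
```

Once that container is up alongside the rest of the compose stack, the plan is to run something like `docker exec -it <container> scrapy crawl <spider>` by hand, and eventually move that command onto a daily schedule.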
How would I go about this?
I've tried `CMD ["/bin/sh"]`, but the container exits straight away with code 0.
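For context, the compose side currently looks something like this (the service name and build context are placeholders):

```yaml
# docker-compose.yml (sketch): the scraper service next to the rest of the app
services:
  scraper:
    build: .
    # nothing here overrides the image's CMD, so /bin/sh is what runs
```

As soon as the service starts, the container is reported as exited with code 0, presumably because the shell has no attached stdin and simply returns.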