I've written a few Scrapy spiders that I want to run inside a Django project when a user makes a request. Since I wasn't able to get the spiders to work when they were all inside a single project, I separated them into their own projects. I have been able to run a spider from a bash script successfully, and I thought I could run that script inside a Celery task.
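To make the idea concrete, the task I was picturing would look something like the sketch below; the script path and task name are just placeholders, since I haven't actually wired this up yet:

```python
# tasks.py -- rough sketch of the idea, not code from my project;
# the script path and task name are placeholders
import subprocess

from celery import shared_task


@shared_task
def crawl_all():
    # run_spiders.sh is the bash script that already works on its own:
    # it goes into each Scrapy project and calls `scrapy crawl <spider>`
    subprocess.run(["/path/to/run_spiders.sh"], check=True)
```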
I've seen posts like this, but they don't use bash scripts to run the spiders, and I also can't tell whether they actually run multiple spiders (not to mention that the code is probably deprecated at this point). The bash-script approach is of course just an idea, so I want to find out the proper way to do this with the current versions of Scrapy, Django, and Celery. Does it make sense to run the spiders individually from a bash script, or is there another way to run multiple spiders within a Celery task? I'm not really sure where to start, so any help would be appreciated. Thanks.
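For reference, the posts I've found seem to do something roughly like the sketch below instead of shelling out to a script; the spider names here are placeholders, and I can't tell whether this still works with current Scrapy inside a Celery worker, or how it would handle spiders that live in separate projects:

```python
# tasks.py -- roughly what the posts I've seen appear to do, as far as
# I can tell; spider names are placeholders
from celery import shared_task
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings


@shared_task
def crawl_all_in_process():
    process = CrawlerProcess(get_project_settings())
    # queue each spider, then start the reactor once for all of them
    process.crawl("spider_one")
    process.crawl("spider_two")
    process.start()
```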