
This works fine on my local deployment, but on the cloud deployment it does not.

with open(file_path, "wb+") as fp:
    for chunk in file:
        fp.write(chunk)

result = upload.delay(name, file_path)

In a different file:

from celery import shared_task
from pathlib import Path
import os

@shared_task
def upload(name, file_path):

    path = Path(file_path)
    if os.path.isfile(path):
        ...  # do something with the file

The error is

Not a path /mediafiles/rawfiles/file.png, FileNotFoundError: [Errno 2] No such file or directory

When I navigate inside the Docker container to /mediafiles/rawfiles, the file is there and has a non-zero size.

I am using DRF -> Celery -> Django.

Can someone explain why the cloud deployment is not able to find the file?

Joseph Adam

2 Answers


When you delegate a task to a Celery worker with @shared_task and that task accesses the filesystem, it looks for the file on the worker machine, not on the machine that queued the task. So you either need to run the task on the machine that has the file, or copy/share the file so the worker can reach it.
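
For example, if Django and the Celery worker run in separate Docker containers, mounting a shared named volume at the same path in both is one way to make the file reachable from the worker. A minimal docker-compose sketch, where the service names, image name, Celery app name, and the /mediafiles mount point are assumptions rather than details from the question:

services:
  web:
    image: my-django-app          # hypothetical image running DRF/Django
    volumes:
      - mediafiles:/mediafiles    # Django writes uploads here
  worker:
    image: my-django-app          # same code base, started as a Celery worker
    command: celery -A tasks worker
    volumes:
      - mediafiles:/mediafiles    # the worker sees the same files at the same path

volumes:
  mediafiles:

With a setup like this, the path passed to upload.delay() refers to the same data in both containers.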

GreyBit
  • Thank you so much for this clarification! This explains a lot. Do you have any suggestions on how to copy that file from Django to the Celery worker? – Joseph Adam Jun 03 '22 at 10:30
  • Maybe you need to use something like `s3` or `minio` for storing the file, so the Celery worker can fetch it by path. I think you should look in this direction. – GreyBit Jun 03 '22 at 11:02
  • Thanks, you pointed me in the right direction. My question should have been "how to share volumes between celery and django", and I found the answer here: https://stackoverflow.com/questions/44284484/docker-compose-share-named-volume-between-multiple-containers/44284993#44284993 – Joseph Adam Jun 03 '22 at 14:05

I added --workdir to the worker start command, and then it worked well.

celery -A tasks --workdir=. worker ...

Hope that helps.