
I'm trying to get 3 Jobs to run after each other in the sequence:

Job1 -> Job2 -> Job3

These 3 jobs are defined in operation.py:

def Job1(x):
    return x

def Job2(x):
    return x * x

def Job3(x):
    print(x)

I'm calling these jobs in script.py with the rq worker running:

from redis import Redis
from rq import Queue
from operation import Job1, Job2, Job3

redis_conn = Redis()
q = Queue(connection=redis_conn)

for num in [1, 2, 3, 4, 5, 6, 7, 8]:
    j1 = q.enqueue(Job1, num)
    j2 = q.enqueue(Job2, j1.result, depends_on=j1)
    j3 = q.enqueue(Job3, depends_on=j2)

As per the documentation, I expect j3 to wait for j2, which in turn should wait for j1 to finish executing. But this doesn't seem to be happening; the jobs appear to run asynchronously, because the rq worker reports this error:

File "./operation.py", line 5, in Job2
    return x * x
TypeError: unsupported operand type(s) for *: 'NoneType' and 'NoneType'

Instead of waiting for j1 to finish, j2 also fires off immediately; since j1's result isn't ready at that point, j1.result is None, and that is what gets passed to Job2. What is wrong with my approach? Why aren't the jobs running sequentially?


1 Answer


When you add the jobs to the queue, j1.result is None: it only gets a value once a worker has finished executing j1.
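You can see this by enqueueing a job and reading its result straight away (reusing the queue setup from the question, with a local Redis and before any worker has run the job):

from redis import Redis
from rq import Queue
from operation import Job1

q = Queue(connection=Redis())

j1 = q.enqueue(Job1, 4)
print(j1.result)  # None -- no worker has finished the job yet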

I think you would need to pass j1.id to Job2 and do something similar to this:

def Job2(job1_id):
    from redis import Redis
    from rq.job import Job

    # Look up the finished Job1 in Redis and read its result
    # (assumes Redis on localhost, as in the question)
    job1 = Job.fetch(job1_id, connection=Redis())
    x = job1.result
    return x * x
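On the enqueue side, script.py would then pass the job id instead of the not-yet-available result. A rough sketch along those lines (Job3 would need the same fetch-by-id change to read j2's result):

from redis import Redis
from rq import Queue
from operation import Job1, Job2, Job3

redis_conn = Redis()
q = Queue(connection=redis_conn)

for num in [1, 2, 3, 4, 5, 6, 7, 8]:
    j1 = q.enqueue(Job1, num)
    # Pass the id; depends_on makes the worker wait until j1 has finished
    j2 = q.enqueue(Job2, j1.id, depends_on=j1)
    j3 = q.enqueue(Job3, j2.id, depends_on=j2)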

More info at http://python-rq.org/docs/jobs/#retrieving-a-job-from-redis

Edit: this answer does it more cleanly, without passing the id around: https://stackoverflow.com/a/37713756/239408
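The idea there, roughly, is to read the result from the job's dependency inside the worker rather than passing an id explicitly. A sketch of that style (this assumes rq's get_current_job() and the Job.dependency attribute; see the linked answer for the exact code):

from rq import get_current_job

def Job2(*args):
    # The currently running job knows which job it depended on,
    # so the result can be read from there directly.
    x = get_current_job().dependency.result
    return x * x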
