@app.task(rate_limit='1/s')
def task1():
    print('hi')

@app.task(rate_limit='1/s')
def task2():
    print('hello')

This code will print 1 hi/sec and 1 hello/sec. That is not what I want. I want it to print 1 (hi+hello)/sec, i.e. apply the rate limit not to each task individually but across multiple tasks.

The mix could be (0.5 hi + 0.5 hello)/sec or (0.7 hi + 0.3 hello)/sec, depending on the rate at which each task is requested. However, only one of the two tasks should run in any given second.

JoshMc
sokancho
  • How about making a single task and rate it to 1/sec then dispatch the action depending on the parameter? – hurturk Mar 17 '17 at 03:11
  • That's what I want to avoid because of a serialization issue. I can't just pass a lambda object to tasks. Is that the only option? – sokancho Mar 17 '17 at 04:06
  • You don't have to pass lambda or complex objects, you can just use strings for function names to be called in same module, like "task2" and so on. Here is an [example](http://stackoverflow.com/a/3071/1233686). Not sure if there are other ways around, but celery [doc](http://docs.celeryproject.org/en/latest/userguide/tasks.html#Task.rate_limit) states that this limit is only for worker itself, not global. For global, you can consider limiting the queue itself. – hurturk Mar 17 '17 at 04:12
  • Can I ask what the serialization issues are? Any parameter you pass to either of the tasks is going to need to be serialized anyway, I believe. – Jenner Felton Mar 17 '17 at 04:25
  • I was saying that a lambda object cannot be serialized to JSON, so I can't avoid creating a JSON object containing the function name and arguments to pass. I just wanted an elegant solution, but it seems there isn't one. – sokancho Mar 17 '17 at 04:30
  • Here's an implementation that rolls its own rate limiting: https://gist.github.com/Vigrond/2bbea9be6413415e5479998e79a1b11a. You just need to share the redis keys between your two tasks and they'll be on the same rate limit I believe. – DylanYoung Oct 29 '19 at 17:26
  • There are two other alternatives: (1) Modify Celery to support this (it's perfectly possible to create a per-worker rate limit that respects many tasks without any global state)... You *may* even be able to do this with just a custom Task subclass, though you'd have to dig into the internals to see if this is feasible or (2) as you've suggested create a dispatcher task, this could be quite simple or relatively complex depending on your needs (if your example is accurate, it should be *very* simple, but I suspect it's a simplified example that doesn't really represent your real need) – DylanYoung Oct 29 '19 at 17:36
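The single-dispatcher approach suggested in the comments can be sketched without serializing lambdas: the caller passes a plain string (which is JSON-safe), and the task looks the real callable up in a module-level registry. A minimal sketch; the `do_hi`/`do_hello` names and the `DISPATCH` registry are illustrative, and the `@app.task` decorator is shown commented out because it assumes an existing Celery `app` instance:

```python
# String-based dispatch: one rate-limited task covers several actions,
# so the single rate_limit='1/s' applies to all of them combined.

def do_hi():
    return 'hi'

def do_hello():
    return 'hello'

# Registry mapping JSON-serializable names to local callables.
DISPATCH = {'task1': do_hi, 'task2': do_hello}

# @app.task(rate_limit='1/s')  # uncomment with a real Celery app instance
def dispatch(name, *args, **kwargs):
    """Run the registered function; only this one task carries the limit."""
    try:
        func = DISPATCH[name]
    except KeyError:
        raise ValueError('unknown task name: %r' % name)
    return func(*args, **kwargs)
```

Callers would then enqueue `dispatch.delay('task1')` or `dispatch.delay('task2')`, and Celery's per-task 1/s limit becomes a shared limit over both actions.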
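The shared-key idea behind the linked gist can also be illustrated on its own: both tasks check one shared piece of rate-limit state before doing any work. In the gist that state lives in Redis so every worker sees the same counter; the sketch below uses a process-local timestamp purely to show the logic, and the class and method names are my own, not from the gist:

```python
import time

class SharedRateLimiter:
    """In-memory sketch of one rate limit shared by several tasks.

    A production version would keep `last_run` in Redis (as the gist
    does) so the limit holds across workers; this local attribute is a
    stand-in for that shared key.
    """

    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval  # seconds between any two runs
        self.last_run = None

    def acquire(self, now=None):
        """Return True if a task may run now, else False."""
        now = time.monotonic() if now is None else now
        if self.last_run is None or now - self.last_run >= self.min_interval:
            self.last_run = now
            return True
        return False

# One limiter instance shared by both task bodies => combined 1/s limit.
limiter = SharedRateLimiter(min_interval=1.0)

def task1_body():
    return 'hi' if limiter.acquire() else None

def task2_body():
    return 'hello' if limiter.acquire() else None
```

A task that fails to acquire would typically retry itself with a countdown rather than return `None`, but that detail depends on how the tasks are wired up.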

0 Answers