Dask workers maintain a single thread pool that they use to launch tasks. Each task always consumes one thread from this pool; you cannot tell a task to take several threads from it.
However, there are other ways to control and restrict concurrency within Dask workers. In your case you might consider defining worker resources. This lets you prevent many big tasks from running at the same time on the same worker.
In the following example we define that each worker has one `Foo` resource and that each task requires one `Foo` to run. This will stop any two such tasks from running concurrently on the same worker.
```shell
# start two workers, each advertising a single unit of the Foo resource
dask-worker scheduler-address:8786 --resources Foo=1
dask-worker scheduler-address:8786 --resources Foo=1
```
```python
from dask.distributed import Client

client = Client('scheduler-address:8786')

# each task reserves one Foo, so at most one runs per worker at a time
futures = client.map(my_expensive_function, ..., resources={'Foo': 1})
```
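To see why this caps concurrency, it may help to think of a worker resource as a counting semaphore: a task must acquire its declared units before it starts and releases them when it finishes. The sketch below is a plain-stdlib illustration of that semantics (it does not use Dask at all, and `my_expensive_function` here is a stand-in): a pool of 4 threads is available, but a `Foo=1` semaphore keeps the observed concurrency at one task at a time.

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

foo = threading.Semaphore(1)   # the worker advertises Foo=1
lock = threading.Lock()
running = 0
max_running = 0

def my_expensive_function(x):
    """Stand-in for a big task that requires one Foo to run."""
    global running, max_running
    with foo:                  # acquire one Foo unit before starting
        with lock:
            running += 1
            max_running = max(max_running, running)
        time.sleep(0.05)       # simulate expensive work
        with lock:
            running -= 1
    return x * 2               # Foo is released here

# 4 threads are available in the pool...
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(my_expensive_function, range(8)))

# ...but at most one task held Foo at any moment
print(max_running)
```

Dask's resource accounting happens in the scheduler rather than via an in-process semaphore, but the effect on a single worker is the same: tasks queue for the resource instead of running in parallel.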